
ScottGu Blog

Azure: Premium Storage, RemoteApp, SQL Database Update, Live Media Streaming, Search and More

Today we released a number of great enhancements to Microsoft Azure. These include:

  • Premium Storage: New Premium high-performance Storage for Azure Virtual Machine workloads
  • RemoteApp: General Availability of Azure RemoteApp service
  • SQL Database: Enhancements to Azure SQL Databases
  • Media Services: General Availability of Live Channels for Media Streaming
  • Azure Search: Enhanced management experience, multi-language support and more
  • DocumentDB: Support for Bulk Add Documents and Query Syntax Highlighting
  • Site Recovery: General Availability of disaster recovery to Azure for branch offices and SMB customers
  • Azure Active Directory: General Availability of Azure Active Directory application proxy and password write back support

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Premium Storage: High-performance Storage for Virtual Machines

I’m excited to announce the public preview of our new Azure Premium Storage offering. With the introduction of the new Premium Storage option, Azure now offers two types of durable storage: Premium Storage and Standard Storage. Premium Storage stores data durably on Solid State Drives (SSDs) and provides high-performance, low-latency disk storage with consistent performance delivery guarantees.

[Image: Azure Premium Storage]

Premium Storage is ideal for I/O-intensive workloads - and is great for database workloads hosted within Virtual Machines.  You can attach several Premium Storage disks to a single VM, supporting up to 32 TB of disk storage per Virtual Machine and driving more than 50,000 IOPS per VM at less than 1 millisecond latency for read operations. This provides a wickedly fast storage option that enables you to run even more workloads in the cloud.

Using Premium Storage, Azure now offers the ability to "lift-and-shift" more demanding enterprise applications to the cloud - including SQL Server, Dynamics AX, Dynamics CRM, Exchange Server, MySQL, Oracle Database, IBM DB2, and SAP Business Suite solutions.

Premium Storage is now available in public preview starting today. To sign up to use the Azure Premium Storage preview, visit the Azure Preview page.

Disk Sizes and Performance

Premium Storage disks provide up to 5,000 IOPS and 200 MB/sec throughput depending on the disk size. When you create a new premium storage disk you get the option to select the disk size and performance characteristics you want based on your application performance and storage capacity needs.  For the public preview, we are offering three Premium Storage disk configurations:

| Disk Type | P10 | P20 | P30 |
| --- | --- | --- | --- |
| Disk Size | 128 GB | 512 GB | 1 TB |
| IOPS per Disk | 500 | 2,300 | 5,000 |
| Throughput per Disk | 100 MB/sec | 150 MB/sec | 200 MB/sec |

You can maximize the performance of your VMs by attaching multiple Premium Storage disks to them (up to the network bandwidth limit available to the VM for disk traffic). To learn the disk bandwidth available for each VM size, see the Virtual Machine and Cloud Service Sizes for Azure documentation.
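As a rough worked example based on the preview limits above: striping ten P30 disks into a single volume would in principle provide up to 10 TB of storage, 50,000 IOPS (10 x 5,000) and 2,000 MB/sec of raw disk throughput - with the realized numbers capped by the disk bandwidth limit of the VM size you choose.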

Durability

Durability of data is of utmost importance for any persistent storage option. Azure customers have critical applications that depend on the persistence of their data and high tolerance for failures. Premium Storage keeps three replicas of data within the same region. In addition, you can optionally create snapshots of your disks and copy those snapshots to a Standard GRS storage account - which enables you to maintain a geo-redundant snapshot of your data that is stored at least 400 miles away from your primary Azure region.

Learn More

You can learn more about Premium Storage disks here.  To sign up to use Premium Storage, go to the Azure Preview page, and sign up for Microsoft Azure Premium Storage service using your subscription.

RemoteApp: General Availability of Azure RemoteApp

I’m excited to announce the general availability of Azure RemoteApp. Using Azure RemoteApp, you can deploy Windows desktop applications in the cloud, and provide your users with an intuitive, high-fidelity, WAN-ready remote application experience.  Users can run the cloud-hosted Windows applications you enable on their phones, tablets, or PCs - including Windows, Mac, iOS and Android based devices.  We are delivering RemoteApp at a super competitive price - you can host your users’ applications in the cloud for just $10/user/month.  With today’s release, Azure RemoteApp is backed by an SLA and supported by Microsoft Support, offering the full scalability and security of the Azure cloud.

Getting Started

Setting up RemoteApp is easy. In the Azure Management Portal, select NEW -> App Services -> RemoteApp -> Quick Create. Pick a name, region, select the scale configuration plan you want to use, pick one of the standard template images, and click OK. When you do this for the first time, your 30-day free trial will also start. This is a fully featured trial, available to all Azure customers.

[Image: RemoteApp Quick Create in the Azure Management Portal]

A RemoteApp instance is an elastic, automatically scaled collection of Windows Server VMs that run the Remote Desktop Session Host role and host the applications. The VMs are all created based on the template image you provide. You can provide your own template image containing your custom apps, or use one of the standard template images we provide. One of these is for Office 365 ProPlus, which you can use if you have a subscription that contains the Office 365 ProPlus service:

[Image: RemoteApp template images, including Office 365 ProPlus]

Once enabled, your users can quickly and easily connect to the applications you host in Azure.  They can use Windows, Mac, iOS and Android devices to connect to the RemoteApp service - enabling you to use Azure to run your Windows desktop applications anywhere in the world, on any device.

Enabling Hybrid Applications

Many business-critical Windows applications rely on on-premises infrastructure such as identity and machine management, and require access to on-premises databases and resources. Azure RemoteApp provides a hybrid deployment model that supports all of these scenarios.

  • Hybrid Management: In a hybrid RemoteApp collection, the VMs which host your applications are joined to your AD domain. Therefore, you can use on-premises management tools like Group Policy, System Center, and many other enterprise management tools that rely on AD membership.

  • Federated Identity: You can use Azure Active Directory integrated with your on-premises AD, and your users can log on with their familiar corporate identities. When a Windows application starts, it runs in a fully domain-joined session, with the usual integrated authentication capabilities of a Windows domain.
  • Hybrid Networking: Windows applications in a hybrid RemoteApp collection can seamlessly access on-premises data and resources. This capability is built on Azure Virtual Networking with site-to-site VPN, providing virtual network connectivity between the cloud and your premises. In the future, RemoteApp collections will support the full range of Azure Networking capabilities, including ExpressRoute.

Performance and Scale Configurations

With today's general availability release, we are offering two scale configurations: BASIC and STANDARD.

  • BASIC is intended for lighter, task-worker use cases, such as a single productivity application or a data-entry frontend to a line of business system.
  • STANDARD is intended for typical productivity use cases such as using Outlook, Word, Excel and other knowledge worker line of business and productivity applications.

You can select the scale configuration for your RemoteApp collection while creating it. If you want to change it later, you can do so using the SCALE tab. Your applications and settings and your user data remain intact through this change.

[Image: Changing the scale configuration on the SCALE tab]

Pricing

We are making the RemoteApp service available at a very attractive, affordable price.  You can host a line of business Windows application for as little as $10/user per month using the BASIC configuration.

At the STANDARD level, you can host your users’ entire productivity workspace for just $15/user per month.
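To put those rates in concrete terms (simple arithmetic on the published per-user prices): a 100-user deployment of a single line-of-business app on BASIC works out to about $1,000/month, while hosting those same users’ full productivity workspace on STANDARD comes to about $1,500/month.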

Learn More

A variety of resources are available on the RemoteApp overview page. You can also download the client for your device and take a test drive. Finally, the RDV Team blog discusses today’s new features in more detail.

SQL Databases: Now with SQL 2014 Features and Compatibility

Today we are making available a preview of the next-generation release of our Azure SQL Database service.  We announced some of the preview's new features earlier in November.  Today's release delivers near-complete SQL Server 2014 engine compatibility and even better performance.

Our internal benchmark tests (using over 600 million rows of data) show query performance improvements of around 5x with today's preview relative to our existing Premium Tier SQL Database offering, and up to 100x when using the new in-memory columnstore technology the preview introduces.

Lots of great new features and improvements

Key highlights of today's preview include:

  • Better management of large databases. We now support heavier database workload management with parallel queries, table partitioning, online indexing, worry-free large index rebuilds (the previous 2 GB size limit has been removed), and more ALTER DATABASE options.

  • Support for more programmability capabilities: You can now build even more robust applications with CLR, T-SQL window functions, XML indexes, and change tracking support.

  • Up to 100x performance improvements with support for in-memory columnstore queries for data mart and analytic workloads (see the T-SQL sketch after this list).

  • Improved monitoring and troubleshooting: Extended Events (XEvents) and visibility into over 100 new table views via an expanded set of Dynamic Management Views (DMVs).

  • New S3 performance level: Today's preview introduces a new pricing option for SQL Databases. The new "S3" performance tier delivers 100 DTU of performance (twice the DTU level of the existing S2 tier) and all of the features available in the Standard tier. It enables an even more cost effective way to run applications with higher performance needs.
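As a minimal T-SQL sketch of the columnstore scenario (the dbo.FactSales table and its columns are hypothetical, purely for illustration - the syntax follows SQL Server 2014, which today's preview targets):

CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales ON dbo.FactSales;

-- Analytic aggregations like this one then scan the compressed column segments:
SELECT ProductId, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY ProductId;

The biggest wins come from scan-heavy aggregations over wide fact tables; OLTP-style point lookups remain better served by conventional rowstore indexes.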

Learn More and Get Started

You can read more about the enhancements in today's preview on the preview getting started page.  To use today's preview, navigate to the SETTINGS section of the SQL Database blade in the Azure Preview Portal and upgrade to the preview.

[Image: Upgrading a SQL Database to the preview in the Azure Preview Portal]

Try out the new features and give us your feedback!

Media Services: General Availability of Live Media Streaming

I’m very excited to announce the General Availability of Live Channels Media Streaming support.  Live Channels with Azure Media Services is the live services backbone that broadcasters such as NBC Sports have used to deliver live online media streaming for events such as the English Premier League, NHL hockey, and Sunday Night Football.  A dozen international broadcasters also used it to seamlessly deliver live media streaming coverage of the 2014 Sochi Winter Olympics and the 2014 FIFA World Cup.

You can now use Live Channels to stream events of your own - and scale to literally millions of users watching them.  Today's general availability release is backed by an enterprise-grade Service-Level Agreement (SLA) for all customers. 

[Image: Live Streaming]

Learn More

For more information on functionality and pricing, visit the Getting Started with Live Streaming blog post, the Media Services webpage or Media Services Pricing webpage, or the Live Streaming MSDN section.

Search: Portal Management, Multi-language support

I am happy to announce a number of highly requested features available today in Azure Search.  Azure Search provides developers with all of the features needed to build out a search experience for web and mobile applications without having to deal with the typical complexities that come with managing, tuning and scaling a real-world search service. 

Azure Portal Enhancements

Helping developers set up and manage their services quickly and easily is a key goal of the Azure Management Portal. Today's release adds several new capabilities to the Azure Search support in the portal that make it even easier to get started with Azure Search and reduce the need to write code.

For example, you can now easily create a new index. In the portal, you can name the search index, set all of the fields, and assign the properties for each of them - all without writing any code:

[Image: Creating a new search index in the portal]

Once you create the index, you can also now drill into usage details like document counts and storage size. You can see all of the fields associated with this index as shown below:

[Image: Index details, including document count and storage size]

Index tuning is another enhancement now supported in the portal user interface. Boosting relevant items not only provides a better search experience, it also helps you achieve business objectives. For example, if you are searching a product index, you might want to boost documents where the result was found in the product name, as opposed to another document where the result was found in the product description. Or you may wish to use a scoring function that allows you to boost items that have high star ratings or that provide higher margins.

The task of tuning an index was previously only available through the API. Starting today, using the Azure Preview portal you can create or alter scoring profiles, instantly tuning the results of your search queries without having to write a line of code:

[Image: Creating a scoring profile in the Azure Preview Portal]

Multi-language Support across 27 Languages

With today’s release, Azure Search now has support for 27 languages. This allows Azure Search to accommodate the unique characteristics of a given language, enabling word-breaking and text normalization to work as expected for each language. Part of this enhancement includes support for stemming in the relevant languages, reducing words to their word stems. For example, you can search for the word “runs” and find documents that say “run” or “running”.

When creating an index you can choose to include content from multiple languages, allowing you to search and return results based on the chosen language of your user. For more information, you can visit the Language Support page. Over time, we will continue to enhance multi-language support to include additional languages.

API features

Last but not least, we’ve introduced a new Azure Search Management REST API that allows you to perform common administrative tasks, such as creating new services, and scaling services to allow for additional storage or higher query-per-second rates. You can see a sample of how to use this Management API at CodePlex.

DocumentDB: Bulk Add Documents and Syntax Highlighting

DocumentDB is a NoSQL document database service designed for scalable and high performance modern applications.  DocumentDB is delivered as a fully managed service (meaning you don’t have to manage any infrastructure or VMs yourself) with an enterprise grade SLA.

We now support some nice new capabilities for DocumentDB in the Azure Preview Portal:

  • Add Documents: Upload existing JSON documents via Document Explorer
  • Query syntax highlighting: Syntax highlighting for DocumentDB SQL queries in Query Explorer

These features make it even easier to get started and explore DocumentDB.

Add Documents Support within the Azure Portal

The DocumentDB Document Explorer within the Azure Preview Portal now supports the uploading of existing JSON documents - which makes it easy to import and start using existing data stored in existing JSON files. Simply open Document Explorer and click the Add Document command:

[Image: The Add Document command in Document Explorer]

In the new blade that opens, click the browse button to open a file explorer and select one or more JSON documents to upload. Note that Document Explorer currently supports up to 100 JSON document files in a single upload operation.

[Image: Selecting JSON documents to upload]

Once you’re satisfied with your selection, click the upload button. The documents will automatically be added to the Document Explorer grid and aggregate results will be displayed as the upload operation is in progress.

[Image: Upload progress in Document Explorer]

Once the operation has completed, you can select up to another 100 documents to upload without having to close the Add Document blade.  This makes it easy to import data into your DocumentDB databases.

Query Explorer – Syntax Highlighting

We’ve also enabled basic keyword and value highlighting within Query Explorer.

[Image: Query syntax highlighting in Query Explorer]

This makes it even easier to experiment and test queries against your NoSQL databases.
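For example, a simple DocumentDB SQL query like the one below - written against a hypothetical collection holding device readings like those shown earlier in this post - gets keyword and value highlighting as you type it into Query Explorer:

SELECT c.DeviceId, c.Temperature
FROM c
WHERE c.DeviceId = "dev-01"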

Please send us your feedback and suggestions on the Microsoft Azure DocumentDB feedback forum. If you haven’t tried DocumentDB yet, you can learn more about how to get started here.

Disaster Recovery: GA of DR for Branch Offices & SMB Customers

I’m excited to announce the General Availability of the Disaster Recovery (DR) to Azure for Branch offices and SMB feature in our Azure Site Recovery (ASR) service.  Today's new support enables consistent replication, protection, and recovery of Virtual Machines directly in Microsoft Azure.  With this new support we have extended the Azure Site Recovery service to become a simple, reliable and cost-effective DR solution for enabling Virtual Machine replication and recovery between Windows Server 2012 R2 and Microsoft Azure, without having to deploy System Center Virtual Machine Manager on your primary site.

These features build on top of the Hyper-V Replica technology available in Windows Server 2012 R2 and Microsoft Azure to provide remote health monitoring, no-impact recovery plan testing and single-click orchestrated recovery - all of it backed by an enterprise-grade SLA.

Verify DR Plans with Confidence

The Test Failover feature within Azure Site Recovery allows you to test your disaster recovery plans without impacting your production workloads, ensuring that you can perform periodic DR drills to meet your compliance objectives. After enabling the appropriate endpoints, you can connect via RDP to the virtual machine running in Azure.

A Planned Failover shuts down your on-premises machine, transfers the latest changes inside the virtual machine to Azure, and then brings up an instance of the VM in Azure without any data loss. An Unplanned Failover is usually triggered when your on-premises site has been hit by an unexpected disaster.

If you need to fail over a multi-virtual-machine application, you can do so using the One-Click Orchestration using Recovery Plans feature available in Azure Site Recovery. Recovery plans make failover and failback from Azure easy and help ensure that you meet the Recovery Time Objective (RTO) goals of your organization.

Check out additional pricing or product information about Azure Site Recovery, and sign up for a free Azure trial and start using it today.

Active Directory: GA of Application Proxy and Password Writeback support

Today's Azure update includes some great updates to Azure Active Directory.

Azure Active Directory Application Proxy

The Azure Active Directory Application Proxy allows you to make your on-premises web applications securely accessible to users who want to use them from the cloud - and enables you to authenticate access to them using Azure AD.

You can do this without changing your applications and without having to change your DMZ configuration. Just install a lightweight connector anywhere on your network and configure access to the application in your Azure Active Directory, and you can make your SharePoint, Outlook Web Access (or any other Web application that relies on Kerberos) available to users outside your corporate network.

[Image: Azure Active Directory Application Proxy]

With today's release we added support for Kerberos Constrained Delegation. Now, once a user has authenticated to Azure Active Directory, the Azure Active Directory Application Proxy can automatically authenticate users to your on-premises application.

Password Writeback for Azure Active Directory Premium Customers

With the new Password Writeback support in Azure AD Sync, you can now configure your Active Directory system so that any time a user or administrator changes a password in Azure AD, the new password is also written back to your on-premises Active Directory. So, for example, when a user forgets the password to your on-premises AD, they can reset it using the Azure AD password reset service we provide in the cloud, and then use the new password to sign on to your on-premises AD.  This makes it easier for organizations to offer a variety of self-service IT and password reset features to their employees and partners.

Preview of security questions for password reset

With today's release we’re also introducing preview support that enables you to configure security questions for password reset scenarios. This enables users to register their answers to secret questions, and then use those answers to prove who they are when they go to reset a forgotten password.

Add your own password SSO for SaaS applications

With today's release we are introducing the preview of functionality that lets you configure password-based single sign-on for any web application that has an HTML sign-in page, even for applications that are not in the Azure AD Application Gallery. You can also add any links to your users’ Azure AD Access Panel, such as deep links to specific SharePoint pages, or to web apps that use Active Directory Federation Services.

More Ways to Get AD Premium

We now support the ability to purchase Azure Active Directory Premium online at the Office 365 commerce catalogue, where you can purchase Azure AD Premium licenses for as many users as you want.  You can then easily manage your Azure Active Directory by navigating to http://aka.ms/accessAAD or through the Office administration portal.

To learn more about these new capabilities and how you can start using them, read Alex Simons’ post on the Active Directory Team Blog.

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

Announcing Open Source of .NET Core Framework, .NET Core Distribution for Linux/OSX, and Free Visual Studio Community Edition

This week we are holding our Connect() developer event in New York City.  This is an event that is being streamed online for free, and it covers some of the great new capabilities coming with the Visual Studio 2015 and .NET 5 releases.  You can watch the event live as well as on-demand here.

I just finished giving the opening keynote of the event during which I made several big announcements:

Announcing the Open Sourcing of the .NET Core Runtime and Libraries

Over the last several years we have integrated more and more open source technology into our .NET, Visual Studio, and Azure offerings.  We have also begun to open source more of our own code and technology as well.

Earlier this year, at the Build 2014 conference, I announced the creation of the .NET Foundation – which is an independent organization designed to foster the development and collaboration of open source technologies for .NET.  We have now open sourced ASP.NET, EF, Web API, NuGet and the "Roslyn" C# and VB compilers under it. 

It has been great to see the energy and innovation in these technologies since we made the open source announcements. We continue to have dedicated Microsoft teams working on each of them (several of the teams have more developers than ever before).  The open source process is now enabling the teams to collaborate even more with other developers in the community, and every single one of the above projects has now accepted code contributions from developers outside Microsoft.  The combination is enabling an even richer flow of ideas, and even better products.

Open Sourcing the .NET Core Runtime and Libraries

Today I’m excited to announce that we are going even further, and will be open sourcing the .NET Core Runtime.  This will include everything needed to execute .NET code – including the CLR, Just-In-Time Compiler (JIT), Garbage Collector (GC), and core .NET base class libraries.

We are releasing the source under the MIT open source license and are also issuing an explicit patent promise to clarify users’ patent rights to .NET.  This morning, we published the public repository on GitHub where the project will be hosted: https://github.com/dotnet/corefx

Today’s source release includes many of the newer core .NET framework libraries (ImmutableCollections, SIMD, XML and MetadataReader).  These libraries are fully open, and are ready to accept contributions.  Over the next several weeks and months we will continue to transfer source into the repository (including the CoreCLR, which is not there yet but is in the process of being moved) and likewise make it open for contributions.

What does this open sourcing mean?

Today’s open source announcement means that developers will have a fully supported, fully open source, fully cross platform .NET stack for creating server and cloud applications – including everything from the C#/VB compilers, to the CLR runtime, to the core .NET base class libraries, to the higher-level .NET Web, Data and API frameworks.

It is an exciting day for .NET, and the new open source process will allow the .NET teams in Microsoft to collaborate even more deeply with other developers around the world.  The result is going to be even better products for everyone.

Announcing .NET Core Framework on Linux and OSX

Last month at a Cloud Event we held in San Francisco, Satya Nadella – our CEO – showed a slide like this one where he talked about how Microsoft loves Linux:

[Image: “Microsoft loves Linux” slide]

We’ve worked hard with Azure to make it a first-class cloud platform for Linux based applications, and shared how more than 20% of all VMs running on Azure are Linux based.  In fact, we now have 5 different Linux distributions officially supported for use on Azure – with full integration within our management portal and command-line extensibility.

Bringing Core .NET to Linux and OS X

Today I’m excited to announce the .NET side of our Linux support.  In addition to making the .NET server stack open source, we are also going to release an official distribution of .NET Core for Linux, as well as an official distribution of .NET Core for the Mac operating system.

This will enable you to build .NET server and cloud applications and run them on both Windows Server and Linux.  It is going to enable every developer – regardless of what operating system they use to develop or target – to use .NET. And to do so on a fully open source runtime.

We will be working closely with the Mono community as we complete our Linux port.  The Mono community has done a great job advancing .NET and Linux over the last decade.  Releasing the .NET Core source under an open source license is going to enable us to collaborate together much more closely going forward.  There are many Linux enhancements Mono has built that we would like to use, and likewise there are improvements Mono will be able to benefit from by being able to use the .NET source code.  Today’s set of announcements are a big win for everyone.

Announcing Visual Studio Community Edition

I’m also excited to announce that we are launching a new free edition of Visual Studio today that will empower even more developers to build great apps and solutions.

The new Visual Studio Community 2013 edition is a full-featured IDE.  It supports multiple project types in one solution file in a single IDE, and has all of the productivity features and IDE extensibility capabilities (meaning you can use Xamarin, ReSharper, VsVim, and any other VSIX extension) that developers love in Visual Studio.

It is now available completely free for:

  • Any individual developer working on a commercial or non-commercial project
  • Any developer contributing to an open source project
  • Anyone in an academic research or course setting (e.g. students, teachers, classroom, online course)
  • Any non-enterprise organization with 5 or fewer developers working on a commercial/non-commercial project together

We are making it available for download starting today, and developers can download and start using it immediately.  There is no program you need to join to use it – simply visit www.visualstudio.com, click the download button, and you are good to go. 

It is going to enable even more developers to take advantage of Visual Studio and build even better applications.  We are looking forward to seeing what you build with it.

Summary

It has never been a better time to be a software developer.  Software is what enables organizations to succeed in today’s digital environment.  It is what enables businesses to connect better with customers, to deliver amazing new experiences, to drive new revenue streams, and to run operations more efficiently.

Using the cloud, every software developer on the planet can now create and build solutions that can reach millions of users, with no upfront costs, powered by a cloud infrastructure that delivers completely global reach.  The impact an individual developer can now have has never been greater than it is today.

Our goal at Microsoft is to provide developers with the platform and tools that will make them incredibly successful.  It is a mission we have had since the very beginning of the company.  Today’s .NET open source, cross platform, and Visual Studio Community edition announcements will enable the development technology we build to be leveraged by an even wider range of developers.  We are really excited to see some of the new apps and solutions that are built with it.

In addition to the above announcements, we are also announcing and demoing tons of new features and services for the first time at our Connect() event streamed from New York.  You can watch the online presentations here.  Also read Soma’s blog post for a summary of some of the new VS 2015 and .NET 5 capabilities we announced this week.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: @scottgu

Azure: Announcing New Real-time Data Streaming and Data Factory Services

The last three weeks have been busy ones for Azure.  Two weeks ago we announced a partnership with Docker to enable great container-based development experiences on Linux, Windows Server and Microsoft Azure.

Last week we held our Cloud Day event and announced our new G-Series of Virtual Machines as well as our new Premium Storage offering.  The G-Series VMs provide the largest VM sizes available in the public cloud today (nearly 2x more memory than the largest AWS offering, and 4x more memory than the largest Google offering).  The new Premium Storage offering (which will work with both our D-series and G-series of VMs) will support up to 32 TB of storage per VM, more than 50,000 IOPS of disk IO per VM, and enable sub-1ms read latency.  Combined they provide an enormous amount of power that enables you to run even bigger and better solutions in the cloud.

Earlier this week, we officially opened our new Azure Australia regions – which are our 18th and 19th Azure regions open for business around the world.  Then at TechEd Europe we announced another round of new features – including the launch of the new Azure Marketplace, a bunch of great network improvements, our new Batch computing service, general availability of our Azure Automation service and more.

Today, I’m excited to blog about even more new services we have released this week in the Azure Data space.  These include:

  • Event Hubs: a scalable service for ingesting and storing data from websites, client apps, and IoT sensors.
  • Stream Analytics: a cost-effective event processing engine that helps uncover real-time insights from event streams.
  • Data Factory: a service that enables better information production by orchestrating and managing diverse data and data movement.

Azure Event Hubs is now generally available, and the new Azure Stream Analytics and Data Factory services are now in public preview.

Event Hubs: Log Millions of events per second in near real time

The Azure Event Hub service is a highly scalable telemetry ingestion service that can log millions of events per second in near real time.  You can use the Event Hub service to collect data/events from any IoT device, from any app (web, mobile, or a backend service), or via feeds like social networks.  We are using it internally within Microsoft to monitor some of our largest online systems.

Once you collect events with Event Hub you can then analyze the data using any real-time analytics system (like Apache Storm or our new Azure Stream Analytics service) and store/transform it into any data storage system (including HDInsight and Hadoop based solutions).

Event Hub is delivered as a managed service on Azure (meaning we run, scale and patch it for you and provide an enterprise SLA).  It delivers:

  • Ability to log millions of events per second in near real time
  • Elastic scaling support with the ability to scale-up/down with no interruption
  • Support for multiple protocols including support for HTTP and AMQP based events
  • Flexible authorization and throttling device policies
  • Time-based event buffering with event order preservation

The pricing model for Event Hubs is very flexible – for just $11/month you can provision a basic Event Hub with guaranteed performance capacity to capture 1 MB/sec of events sent to your Event Hub.  You can then provision as many additional capacity units as you need if your event traffic goes higher. 
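As a quick worked example using those numbers: a device fleet sending a sustained 5 MB/sec of telemetry would need five capacity units, which comes out to roughly $55/month at the basic $11/unit rate (your actual bill will of course vary with your traffic and plan).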

Getting Started with Capturing Events

You can create a new Event Hub using the Azure Portal or via the command-line.  Choose New->App Service->Service Bus->Event Hub in the portal to do so:

[Image: Creating an Event Hub in the Azure Portal]

Once created, events can be sent to an Event Hub with either a strongly-typed API (e.g. .NET or Java client library) or by just sending a raw HTTP or AMQP message to the service.  Below is a simple example of how easy it is to log an IoT event to an Event Hub using just a standard HTTP post request.  Notice the Authorization header in the HTTP post – you can use this to optionally enable flexible authentication/authorization for your devices:

POST https://your-namespace.servicebus.windows.net/your-event-hub/messages?timeout=60&api-version=2014-01 HTTP/1.1
Authorization: SharedAccessSignature sr=your-namespace.servicebus.windows.net&sig=tYu8qdH563Pc96Lky0SFs5PhbGnljF7mLYQwCZmk9M0%3d&se=1403736877&skn=RootManageSharedAccessKey
Content-Type: application/atom+xml;type=entry;charset=utf-8
Host: your-namespace.servicebus.windows.net
Content-Length: 42
Expect: 100-continue

{ "DeviceId":"dev-01", "Temperature":"37.0" }

Your Event Hub can collect up to millions of messages per second like this, each storing whatever data schema you want within them, and the Event Hubs service will store them in order for you to later read/consume.

Downstream Event Processing

Once you collect events, you no doubt want to do something with them.  Event Hubs includes an intelligent processing agent that allows for automatic partition management and load distribution across readers.  You can implement any logic you want within readers, and the data sent to the readers is delivered in the order it was sent to the Event Hub.

In addition to supporting the ability for you to write custom Event Readers, we also provide two easy ways to work with pre-built stream processing systems: our new Azure Stream Analytics Service and Apache Storm.  Our new Azure Stream Analytics service supports doing stream processing directly from Event Hubs, and Microsoft has created an Event Hubs Storm Spout for use with Apache Storm clusters.

The below diagram helps express some of the many rich ways you can use Event Hubs to collect and then hand-off events/data for processing:

[Image: Event Hubs event collection and hand-off scenarios]

Event Hubs provides a super flexible and cost-effective building block that you can use to collect and process any events or data you can stream to the cloud, and it provides the scalability to meet any need.

Learning More about Event Hubs

For more information about Azure Event Hubs, please review the following resources:

Stream Analytics: Distributed Stream Processing Service for Azure

I’m excited to announce the preview of our new Azure Stream Analytics service – a fully managed, real-time distributed stream computation service that provides low-latency, scalable processing of streaming data in the cloud with an enterprise-grade SLA. The new Azure Stream Analytics service easily scales from small projects with just a few KB/sec of throughput to a gigabyte/sec or more of streamed data messages/events.

Our Stream Analytics pricing model enables you to run low-throughput streaming workloads continuously at low cost, and to scale up only as your business needs increase. We do this while maintaining built-in guarantees of event delivery and state management for fast recovery, which enables mission-critical business continuity.

Dramatically Simpler Developer Experience for Stream Processing Data

Stream Analytics supports a SQL-like language that dramatically lowers the bar of the developer expertise required to create a scalable stream processing solution. A developer can simply write a few lines of SQL to do common operations including basic filtering, temporal analysis operations, joining multiple live streams of data with other static data sources, and detecting stream patterns (or lack thereof).

This dramatically reduces the complexity and time it takes to develop, maintain and apply time-sensitive computations on real-time streams of data. Most other streaming solutions available today require you to write complex custom code, but with Azure Stream Analytics you can write simple, declarative and familiar SQL.

Fully Managed Service that is Easy to Setup

With Stream Analytics you can dramatically accelerate how quickly you can derive valuable real time insights and analytics on data from devices, sensors, infrastructure, or applications. With a few clicks in the Azure Portal, you can create a streaming pipeline, configure its inputs and outputs, and provide SQL-like queries to describe the desired stream transformations/analysis you wish to do on the data. Once running, you are able to monitor the scale/speed of your overall streaming pipeline and make adjustments to achieve the desired throughput and latency.

You can create a new Stream Analytics Job in the Azure Portal, by choosing New->Data Services->Stream Analytics:

[Image: Creating a Stream Analytics job in the Azure Portal]

Setup Streaming Data Input

Once created, your first step will be to add a Streaming Data Input.  This allows you to indicate where the data you want to perform stream processing on is coming from.  From within the portal you can choose Inputs->Add An Input to launch a wizard that enables you to specify this:

[Image: Adding a streaming data input]

We can use the Azure Event Hub Service to deliver us a stream of data to perform processing on. If you already have an Event Hub created, you can choose it from a list populated in the wizard above.  You will also be asked to specify the format used to serialize incoming events in the Event Hub (e.g. JSON, CSV or Avro).

Setup Output Location

The next step to developing our Stream Analytics job is to add a Streaming Output Location.  This will configure where we want the output results of our stream processing pipeline to go.  We can choose to easily output the results to Blob Storage, another Event Hub, or a SQL Database:

[Image: Choosing an output location]

Note that being able to use another Event Hub as a target provides a powerful way to connect multiple streams into an overall pipeline with multiple steps.

Write Streaming Queries

Now that we have our input and output sources configured, we can write SQL queries to transform, aggregate and/or correlate the incoming input (or set of inputs in the event of multiple input sources) and output the results to our output target.  We can do this within the portal by selecting the QUERY tab at the top.

[Image: The QUERY tab]

There are a number of interesting queries you can write to process the incoming stream of data.  For example, in the previous Event Hub section of this blog post I showed how you can use an HTTP POST command to submit JSON-based temperature data from an IoT device to an Event Hub, with data in JSON format like so:

{ "DeviceId":"dev-01", "Temperature":"37.0" }

When multiple devices are streaming events simultaneously into our Event Hub like this, they feed into our Stream Analytics job as a continuous sequence of data events of this shape, one per device reading.

Wouldn’t it be interesting to be able to analyze this data using a time-window perspective instead?  For example, it would be useful to calculate in real-time what the average temperature of each device was in the last 5 seconds of multiple readings.

With the Stream Analytics Service we can now dynamically calculate this over our incoming live stream of data just by writing a SQL query like so:
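Here is a sketch of such a query (assuming the input source we configured earlier is named "input"; the exact column list is up to you):

SELECT
    DeviceId,
    Avg(Temperature) AS AvgTemperature,
    System.Timestamp AS WinEndTime
FROM input
GROUP BY TumblingWindow(second, 5), DeviceId

The TumblingWindow(second, 5) clause partitions the unbounded stream into contiguous 5-second windows, and the query emits one averaged row per device for each window.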

Running this query in our Stream Analytics job will aggregate/transform our incoming stream of data events, emitting one row per device for each 5-second window into the output we configured for our job (e.g. a Blob Storage file or a SQL Database).

The great thing about this approach is that the data is being aggregated/transformed in real time as events are streamed to us, and it scales to handle literally gigabytes of event data streamed per second.

Scaling your Stream Analytics Job

Once defined, you can easily monitor the activity of your Stream Analytics Jobs in the Azure Portal:

[Image: Monitoring a Stream Analytics job]

You can use the SCALE tab to dynamically increase or decrease scale capacity for your stream processing – allowing you to pay only for the compute capacity you need, and enabling you to handle jobs with gigabytes/sec of streamed data. 

Learning More about Stream Analytics Service

For more information about Stream Analytics, please review the following resources:

Data Factory: Fully managed service to build and manage information production pipelines

Organizations are increasingly looking to fully leverage all of the data available to their business.  As they do so, the data processing landscape is becoming more diverse than ever before – data is being processed across geographic locations, on-premises and cloud, across a wide variety of data types and sources (SQL, NoSQL, Hadoop, etc.), and the volume of data needing to be processed is increasing exponentially. Developers today are often left writing large amounts of custom logic to deliver an information production system that can manage and coordinate all of this data and processing work.

To help make this process simpler, I’m excited to announce the preview of our new Azure Data Factory service – a fully managed service that makes it easy to compose data storage, processing, and data movement services into streamlined, scalable & reliable data production pipelines. Once a pipeline is deployed, Data Factory enables easy monitoring and management of it, greatly reducing operational costs. 

Easy to Get Started

The Azure Data Factory is a fully managed service. Getting started with Data Factory is simple. With a few clicks in the Azure preview portal, or via our command line operations, a developer can create a new data factory and link it to data and processing resources.  From the new Azure Marketplace in the Azure Preview Portal, choose Data + Analytics –> Data Factory to create a new instance in Azure:

[Image: Creating a Data Factory in the Azure Preview Portal]

Orchestrating Information Production Pipelines across multiple data sources

Data Factory makes it easy to coordinate and manage data sources from a variety of locations – including ones both in the cloud and on-premises.  Support for working with data on-premises inside SQL Server, as well as Azure Blob, Tables, HDInsight Hadoop systems and SQL Databases is included in this week’s preview release. 

Access to on-premises data is supported through a data management gateway that allows for easy configuration and management of secure connections to your on-premises SQL Servers.  Data Factory balances the scale & agility provided by the cloud, Hadoop and non-relational platforms, with the management & monitoring that enterprise systems require to enable information production in a hybrid environment.

Custom Data Processing Activities using Hive, Pig and C#

This week’s preview enables data processing using Hive, Pig and custom C# code activities.  Data Factory activities can be used to clean data, anonymize/mask critical data fields, and transform the data in a wide variety of complex ways.

The Hive and Pig activities can be run on an HDInsight cluster you create, or alternatively you can allow Data Factory to fully manage the Hadoop cluster lifecycle on your behalf.  Simply author your activities, combine them into a pipeline, set an execution schedule and you’re done – no manual Hadoop cluster setup or management required. 

Built-in Information Production Monitoring and Dashboarding

Data Factory also offers an up-to-the-moment monitoring dashboard, which means you can deploy your data pipelines and immediately begin to view them as part of your monitoring dashboard.  Once you have created and deployed pipelines to your Data Factory you can quickly assess end-to-end data pipeline health, pinpoint issues, and take corrective action as needed.

Within the Azure Preview Portal, you get a visual layout of all of your pipelines and data inputs and outputs. You can see all the relationships and dependencies of your data pipelines across all of your sources so you always know where data is coming from and where it is going at a glance. We also provide you with a historical accounting of job execution, data production status, and system health in a single monitoring dashboard:

[Image: Data Factory monitoring dashboard]

Learning More about Data Factory

For more information about Data Factory, please review the following resources:

Other Great Data Improvements

Today’s releases make it even easier for customers to stream, process and manage the movement of data in the cloud.  Over the last few months we’ve released a bunch of other great data updates as well that make Azure a great platform for any data need.  Since August: 

We released a major update of our SQL Database service, which is a relational database-as-a-service offering.  The new SQL DB editions (Basic/Standard/Premium) support a 99.99% SLA, larger database sizes, dedicated performance guarantees, point-in-time recovery, new auditing features, and the ability to easily setup active geo-DR support. 

We released a preview of our new DocumentDB service, which is a fully-managed, highly-scalable NoSQL document database service that supports saving and querying JSON-based data.  It enables you to linearly scale your document store to any application size.  Microsoft's MSN portal was recently rewritten to use it – and stores more than 20 TB of data within it.

We released our new Redis Cache service, which is a secure/dedicated Redis cache offering, managed as a service by Microsoft.  Redis is a popular open-source solution that enables high-performance data types, and our Redis Cache service enables you to stand up an in-memory cache that can make the performance of any application much faster.

We released major updates to our HDInsight Hadoop service, which is a 100% Apache Hadoop-based service in the cloud. We have also added built-in support for using two popular frameworks in the Hadoop ecosystem: Apache HBase and Apache Storm.

We released a preview of our new Search-As-A-Service offering, which provides a managed search offering based on ElasticSearch that you can easily integrate into any Web or Mobile Application.  It enables you to build search experiences over any data your application uses (including data in SQLDB, DocDB, Hadoop and more).

And we have released a preview of our Machine Learning service, which provides a powerful cloud-based predictive analytics service.  It is designed for both new and experienced data scientists, includes hundreds of algorithms from both the open source world and Microsoft Research, and supports writing ML solutions using the popular open-source R language.

You’ll continue to see major data improvements in the months ahead – we have an exciting roadmap of further improvements.

Summary

Today’s Microsoft Azure release enables some great new data scenarios, and makes building applications that work with data in the cloud even easier.

If you don’t already have an Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

Azure: New Marketplace, Network Improvements, New Batch Service, Automation Service, more

Today we released a major set of updates to Microsoft Azure. Today’s updates include:

  • Marketplace: Announcing Azure Marketplace and partnerships with key technology partners
  • Networking: Network Security Groups, Multi-NIC, Forced Tunneling, Source IP Affinity, and much more
  • Batch Computing: Public Preview of the new Azure Batch Computing Service
  • Automation: General Availability of the Azure Automation Service
  • Anti-malware: General Availability of Microsoft Anti-malware for Virtual Machines and Cloud Services
  • Virtual Machines: General Availability of many more VM extensions – PowerShell DSC, Octopus, VS Release Management

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Marketplace: Announcing Azure Marketplace and partnerships with key technology partners

Last week, at our Cloud Day event in San Francisco, I announced a new Azure Marketplace that helps to better connect Azure customers with partners, ISVs and startups.  With just a couple of clicks, you can now quickly discover, purchase, and deploy any number of solutions directly into Azure.

Exploring the Marketplace

You can explore the Azure Marketplace by clicking the Marketplace tile that is pinned by default to the home-screen of the Azure Preview Portal:

[Image: The Marketplace tile in the Azure Preview Portal]

Clicking the Marketplace tile will enable you to explore a large selection of applications, VM images, and services that you can provision into your Azure subscription:

[Image: Browsing the Azure Marketplace]

Using the marketplace provides a super easy way to take advantage of a rich ecosystem of applications and services integrated to run great with Azure.  Today’s marketplace release includes multi-VM templates to run Hadoop clusters powered by Cloudera or Hortonworks, Linux VMs powered by Ubuntu, CoreOS, SUSE and CentOS, Microsoft SharePoint Server Farms, Cassandra clusters powered by DataStax, and a wide range of security virtual appliances.

You can click any of the items in the gallery to learn more about them and optionally deploy them.  Doing so will walk you through a simple-to-follow creation wizard that enables you to optionally configure how/where they will run, as well as display any additional pricing required for the apps/services/VM images that you select.

For example, below is all it takes to stand-up an 8-node DataStax Enterprise cluster:

[Image: Provisioning an 8-node DataStax Enterprise cluster]

Solutions you purchase through the Marketplace will be automatically billed to your Azure subscription (avoiding the need for you to set up a separate payment method).  Virtual Machine images will support the ability to bring your own license or rent the image license by the hour (which is ideal for proof-of-concept solutions or cases where you need the solution for only a short period of time).  Both Azure Direct customers and customers who pay using an Enterprise Agreement can take advantage of the Azure Marketplace starting today.

You can learn more about the Azure Marketplace as well as browse the items within it here.

Networking: Lots and lots of New Features and Improvements

This week’s Azure update includes a ton of new capabilities to the Azure networking stack.  You can use these new networking capabilities immediately in the North Europe region, and they will be supported worldwide in all regions in November 2014.  The new network capabilities include:

Network Security Groups

You can now create Network Security Groups to define access control rules for inbound and outbound traffic to a virtual machine or a group of virtual machines in a subnet. The security groups and the rules can be managed and updated independently of the lifecycle of the VM.

Multi-NIC Support

You can now create and manage multiple virtual network interfaces (NICs) on a VM.  Multi-NIC support is a fundamental requirement for a majority of network virtual appliances that can be deployed in Azure. Having this support now enabled within Azure will enable even richer network virtual appliances to be used.

Forced Tunneling

You can now redirect or “force” all Internet-bound traffic that originates in a cloud application back through an on-premises network via a Site-to-Site VPN tunnel for inspection and auditing. This is a critical security capability for enterprise grade applications.

ExpressRoute Enhancements

You can now share a single ExpressRoute connection across multiple Azure subscriptions. Additionally, a single Virtual Network in Azure can now be linked to more than one ExpressRoute circuit, thereby enabling much richer backup and disaster recovery scenarios.

[Image: ExpressRoute connection sharing]

New VPN Gateway Sizes

To cater to the growing hybrid connectivity throughput needs and the growing number of cross-premises sites, we are announcing the availability of a higher-performance Azure VPN gateway. This enables faster ExpressRoute and Site-to-Site VPN gateways with more tunnels.

Operations and audit logs for VNet Gateways and ExpressRoute

You can now view operations logs for Virtual Network Gateways and ExpressRoute circuits. The Azure portal will now show operations logs and information on all API calls you make, as well as important infrastructure changes such as scheduled updates to gateways.

Advanced Virtual Network Gateway policies

We now enable the ability for you to control encryption for the tunnel between Virtual Networks. You now have a choice between 3DES, AES128, AES256 and Null encryption, and you can also enable Perfect Forward Secrecy (PFS) for IPsec/IKE gateways.

Source IP Affinity

The Azure Load Balancer now supports a new distribution mode called Source IP Affinity (also known as session affinity or client IP affinity). You can now load balance traffic based on a 2-tuple (Source-IP, Destination-IP) or 3-tuple (Source-IP, Destination-IP and Protocol) distribution modes.

Nested policies for Traffic Manager

You can now create nested policies for traffic management. This allows tremendous flexibility in creating powerful load-balancing and failover schemes to support the needs of larger, more complex deployments.

Portal Support for Managing Internal Load Balancer, Reserved and Instance IP addresses for Virtual Machines

It is now possible to use the Azure Preview Portal to manage creating and setting up internal load balancers, as well as reserved and instance IP addresses for virtual machines.

Automation: General Availability of Azure Automation Service

I am excited to announce the General Availability of the Azure Automation service. Azure Automation enables the creation, deployment, monitoring, and maintenance of resources in an Azure environment using a highly scalable and reliable workflow engine. The service can be used to orchestrate time-consuming and frequently repeated operational tasks across Azure and third-party systems while decreasing operating expenses.

Azure Automation allows you to build runbooks (PowerShell Workflows) to describe your administration processes, provides a secure global assets store so you don’t need to hardcode sensitive information within your runbooks, and offers scheduling so that runbooks can be triggered automatically.

Runbooks can automate a wide range of scenarios – from simple day to day manual tasks to complex processes that span multiple Azure services and 3rd party systems. Because Automation is built on PowerShell, you can take advantage of the many existing PowerShell modules, or author your own to integrate with third party systems.

Creating and Editing Runbooks

You can create a runbook from scratch, or start by importing an existing template in the runbook gallery:

image

Runbooks can also be edited directly in the administration portal:

image

Pricing

Available as a pay-as-you-go service, Automation is billed based on the number of job runtime minutes used in a given Azure subscription.  500 minutes of free job runtime are also included each month for Azure customers to use at no charge.

Learn More

To learn more about Azure Automation, check out the following resources:

Batch Service: Preview of Azure Batch - new job scheduling service for parallel and HPC apps

I’m excited to announce the public preview of our new Azure Batch Service. This new platform service provides “job scheduling as a service” with auto-scaling of compute resources, making it easy to run large-scale parallel and high performance computing (HPC) work in Azure. You submit jobs, we start the VMs, run your tasks, handle any failures, and then shut things down as work completes.

Azure Batch is the job scheduling engine that we use internally to manage encoding for Azure Media Services, and for testing Azure itself. With this preview, we are excited to expand our SDK with a new application framework from GreenButton, a company Microsoft acquired earlier in the year. The Azure Batch SDK makes it easy to cloud-enable parallel, cluster, and HPC applications by describing jobs with the required resources, data, and one or more compute tasks.

Azure Batch can be used to run large volumes of similar tasks or applications in parallel, programmatically. A command line program or script takes a set of data files as input, processes the data in a series of tasks, and produces a set of output files. Examples of batch workloads that customers are running today in Azure include calculating risk for banks and insurance companies, designing new consumer and industrial products, sequencing genes and developing new drugs, searching for new energy sources, rendering 3D animations, and transcoding video.

Azure Batch makes it easy for these customers to use hundreds, thousands, tens of thousands of cores, or more on demand. With job scheduling as a service, Azure developers can focus on using batch computing in their applications and delivering services without needing to build and manage a work queue, scaling resources up and down efficiently, dispatching tasks, and handling failures.

image

The scale of Azure helps batch computing customers get their work done faster, experiment with different designs, run larger and more precise models, and test a large number of different scenarios without having to invest in and maintain large clusters.

Learn more about Azure Batch and start using it for your applications today.

Virtual Machines: General Availability of Microsoft Anti-Malware for VMs and Cloud Services

I’m excited to announce that the Microsoft Anti-malware security extension for Virtual Machines and Cloud Services is now generally available.  We are releasing it as a free capability that you can use at no additional charge.

The Microsoft Anti-malware security extension can be used to help identify and remove viruses, spyware or other malicious software.  It provides real-time protection from the latest threats and also supports on-demand scheduled scanning.  Enabling it is a good security best practice for applications hosted either on-premises or in the cloud.

Enabling the Anti-Malware Extension

You can select and configure the Microsoft Antimalware security extension for virtual machines using the Azure preview portal, Visual Studio or APIs/PowerShell.  Antimalware events are then logged to the customer-configured Azure Storage account via Azure Diagnostics and can be piped to HDInsight or a SIEM tool for further analysis. More information is available in the Microsoft Antimalware Whitepaper.
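
For example, here is a hedged PowerShell sketch of enabling the extension on an existing VM with the generic VM extension cmdlet – the publisher and extension names follow the Microsoft Antimalware whitepaper, and the service/VM names are placeholders:

# Sketch: enable the Microsoft Antimalware extension on an existing VM.
# The JSON shown is the minimal configuration; see the whitepaper for more options.
$config = '{ "AntimalwareEnabled": true }'

Get-AzureVM -ServiceName "my-service" -Name "my-vm" |
    Set-AzureVMExtension -ExtensionName "IaaSAntimalware" -Publisher "Microsoft.Azure.Security" `
        -Version "1.*" -PublicConfiguration $config |
    Update-AzureVM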

To enable the antimalware extension on an existing virtual machine, select the EXTENSIONS tile on a Virtual Machine in the Azure Preview Portal, then click ADD in the command bar and select the Microsoft Antimalware extension. Then click CREATE and customize any settings:

image

Virtual Machines: General Availability of even more VM Extensions

In addition to enabling the Microsoft Anti-Malware extension for Virtual Machines, today’s release also includes support for a number of additional VM extensions that you can enable within your Virtual Machines.  These extensions can be added and configured using the same EXTENSIONS tile on Virtual Machine resources within the Azure Preview Portal (shown in the screenshot in the Anti-malware section above).

The new extensions enabled today include:

PowerShell Desired State Configuration

The PowerShell Desired State Configuration Extension can be used to deploy and configure Azure VMs using Desired State Configuration (DSC) technology. DSC enables you to declaratively specify how you want your software environment to be configured. DSC configuration can also be automated using the Azure PowerShell SDK, and you can push configurations to any Azure VM and have them enacted automatically. For more details, please see this desired state configuration blog post.
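As a quick sketch, a DSC configuration can be authored, published to Azure Storage, and then applied to a VM – the cmdlet names below are from the DSC extension preview and the configuration/VM names are placeholders, so treat this as illustrative:

# A minimal DSC configuration that ensures IIS is installed.
Configuration WebServer
{
    Node "localhost"
    {
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }
    }
}

# Publish the configuration to Azure Storage, then apply it to a VM.
Publish-AzureVMDscConfiguration -ConfigurationPath .\WebServer.ps1

Get-AzureVM -ServiceName "my-service" -Name "my-vm" |
    Set-AzureVMDscExtension -ConfigurationArchive "WebServer.ps1.zip" -ConfigurationName "WebServer" |
    Update-AzureVM
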

image 

Octopus

Octopus simplifies the deployment of ASP.NET web applications, Windows Services and other applications by automatically configuring IIS, installing services and making configuration changes. Octopus integration with Azure was one of the top-requested features on Azure UserVoice, and with this integration we simplify the deployment and configuration of Octopus on the VM.

image

Visual Studio Release Management

Release Management for Visual Studio is a continuous delivery solution that automates the release process through all of your environments, from TFS through to production. Visual Studio Release Management is integrated with TFS, and you can configure multi-stage release pipelines to automatically deploy and validate your applications on multiple environments. With the new Visual Studio Release Management extension, VMs can be preconfigured with the components required for Release Management to operate.

image

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

Docker and Microsoft: Integrating Docker with Windows Server and Microsoft Azure

I’m excited to announce today that Microsoft is partnering with Docker, Inc to enable great container-based development experiences on Linux, Windows Server and Microsoft Azure.

Docker is an open platform that enables developers and administrators to build, ship, and run distributed applications. Consisting of Docker Engine, a lightweight runtime and packaging tool, and Docker Hub, a cloud service for sharing applications and automating workflows, Docker enables apps to be quickly assembled from components and eliminates the friction between development, QA, and production environments.

Earlier this year, Microsoft released support for Docker containers with Linux on Azure.  This support integrates with the Azure VM agent extensibility model and Azure command-line tools, and makes it easy to deploy the latest and greatest Docker Engine in Azure VMs and then deploy Docker based images within them.  

Docker Support for Windows Server + Docker Hub integration with Microsoft Azure

Today, I’m excited to announce that we are working with Docker, Inc to extend our support for Docker much further.  Specifically, I’m excited to announce that:

1) Microsoft and Docker are integrating the open-source Docker Engine with the next release of Windows Server.  This release of Windows Server will include new container isolation technology, and will support running .NET as well as other application types (Node.js, Java, C++, etc.) within these containers.  Developers and organizations will be able to use Docker to create distributed, container-based applications for Windows Server that leverage the Docker ecosystem of users, applications and tools.  It will also enable a new class of distributed applications built with Docker that use Linux and Windows Server images together.

image 

2) We will support the Docker client natively on Windows.  Developers and administrators running Windows will be able to use the same standard Docker client and interface to deploy and manage Docker based solutions with both Linux and Windows Server environments.

image

 

3) Docker for Windows Server container images will be available in the Docker Hub alongside the Docker for Linux container images available today.  This will enable developers and administrators to easily share and automate application workflows using both Windows Server and Linux Docker images.

4) We will integrate Docker Hub with the Microsoft Azure Gallery and Azure Management Portal.  This will make it trivially easy to deploy and run both Linux and Windows Server based Docker images in Microsoft Azure.

5) Microsoft is contributing code to Docker’s Open Orchestration APIs.  These APIs provide a portable way to create multi-container Docker applications that can be deployed into any datacenter or cloud provider environment. This support will allow a developer or administrator using the Docker command line client to launch either Linux or Windows Server based Docker applications directly into Microsoft Azure from his or her development machine.

Exciting Opportunities Ahead

At Microsoft we continue to be inspired by technologies that can dramatically improve how quickly teams can bring new solutions to market. The partnership we are announcing with Docker today will enable developers and administrators to use the best container tools available for both Linux and Windows Server based applications, and to run all of these solutions within Microsoft Azure.  We are looking forward to seeing the great applications you build with them.

You can learn more about today’s announcements here and here.

Hope this helps,

Scott

Azure: Redis Cache, Disaster Recovery to Azure, Tagging Support, Elastic Scale for SQLDB, DocDB

Over the last few days we’ve released a number of great enhancements to Microsoft Azure.  These include:

  • Redis Cache: General Availability of Redis Cache Service
  • Site Recovery: General Availability of Disaster Recovery to Azure using Azure Site Recovery
  • Management: Tags support in the Azure Preview Portal
  • SQL DB: Public preview of Elastic Scale for Azure SQL Database (available through .NET lib, Azure service templates)
  • DocumentDB: Support for Document Explorer, Collection management and new metrics
  • Notification Hub: Support for Baidu Push Notification Service
  • Virtual Network: Support for static private IP support in the Azure Preview Portal
  • Automation updates: Active Directory authentication, PowerShell script converter, runbook gallery, hourly scheduling support

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Redis Cache: General Availability of Redis Cache Service

I’m excited to announce the General Availability of the Azure Redis Cache. The Azure Redis Cache service provides the ability for you to use a secure, dedicated Redis cache, managed as a service by Microsoft. The Azure Redis Cache is now our recommended distributed cache solution for Azure applications.

Redis Cache

Unlike traditional caches, which deal only with key-value pairs, Redis is popular for its support of high-performance data types, on which you can perform atomic operations such as appending to a string, incrementing the value in a hash, pushing to a list, computing set intersection, union and difference, or getting the member with the highest ranking in a sorted set.  Other features include support for transactions, pub/sub, Lua scripting, keys with a limited time-to-live, and configuration settings to make Redis behave more like a traditional cache.

Finally, Redis has a healthy, vibrant open source ecosystem built around it. This is reflected in the diverse set of Redis clients available across multiple languages. This allows it to be used by nearly any application, running on either Windows or Linux, that you host inside of Azure.

Redis Cache Sizes and Editions

The Azure Redis Cache Service is today offered in the following sizes:  250 MB, 1 GB, 2.8 GB, 6 GB, 13 GB, 26 GB, 53 GB.  We plan to support even higher-memory options in the future.

Each Redis cache size option is also offered in two editions:

  • Basic – A single cache node, without a formal SLA, recommended for use in dev/test or non-critical workloads.
  • Standard – A multi-node, replicated cache configured in a two-node Master/Replica configuration for high-availability, and backed by an enterprise SLA.

With the Standard edition, we manage replication between the two nodes and perform an automatic failover if the Master node fails (whether because of an unplanned server failure or planned patching maintenance). This helps ensure the availability of the cache and the data stored within it.

Details on Azure Redis Cache pricing can be found on the Azure Cache pricing page.  Prices start as low as $17 a month.

Create a New Redis Cache and Connect to It

You can create a new instance of a Redis Cache using the Azure Preview Portal.  Simply select the New->Redis Cache item to create a new instance. 

You can then use a wide variety of programming languages and corresponding client packages to connect to the Redis Cache you’ve provisioned.  You use the same Redis client packages to connect to an Azure Redis Cache that you’d use to connect to your own Redis instance – the API and libraries are exactly the same.

Below we’ll use a .NET Redis client called StackExchange.Redis to connect to our Azure Redis Cache instance. First open any Visual Studio project and add the StackExchange.Redis NuGet package to it using the NuGet package manager.  Then obtain the cache endpoint from the Properties blade, and the access key from the Keys blade, of your cache instance within the Azure Preview Portal.

image

Once you’ve retrieved these, create a connection instance to the cache with the code below:

// Connect using your cache endpoint and access key (the key is passed as the password)
var connection = StackExchange.Redis.ConnectionMultiplexer.Connect("contoso5.redis.cache.windows.net,ssl=true,password=...");

Once the connection is established, retrieve a reference to the Redis cache database, by calling the ConnectionMultiplexer.GetDatabase method.

// GetDatabase returns a lightweight handle to the cache database
IDatabase cache = connection.GetDatabase();

Items can be stored in and retrieved from a cache by using the StringSet and StringGet methods (or their async counterparts – StringSetAsync and StringGetAsync).

// Store a string value, then read it back
cache.StringSet("Key1", "HelloWorld");
cache.StringGet("Key1");

You have now stored and retrieved a “Hello World” string from a Redis cache instance running on Azure. For an example of an end-to-end application using Azure Redis Cache, please check out the MVC Movie Application blog post.

Using Redis for ASP.NET Session State and Output Caching

You can also take advantage of Redis to store out-of-process ASP.NET Session State as well as to share Output Cached content across web server instances. 

For more details on using Redis for Session State, check out this blog post: ASP.NET Session State for Redis

For details on using Redis for Output Caching, check out this MSDN post: ASP.NET Output Cache for Redis

Monitoring and Alerting

Every Azure Redis cache instance has built-in monitoring enabled by default. Currently you can track Cache Hits, Cache Misses, Get/Set Commands, Total Operations, Evicted Keys, Expired Keys, Used Memory, Used Bandwidth and Used CPU.  You can easily visualize these using the Azure Preview Portal:

image

You can also create alerts on metrics or events (just click the “Add Alert” button above). For example, you could create an alert rule to notify the cache administrator when the cache is seeing evictions. This in turn might signal that the cache is running hot and needs to be scaled up with more memory.

Learn more

For more information about the Azure Redis Cache, please visit the following links:

Site Recovery: Announcing the General Availability of Disaster Recovery to Azure

I’m excited to announce the general availability of the Azure Site Recovery Service’s new Disaster Recovery to Azure functionality.  The Disaster Recovery to Azure capability enables consistent replication, protection, and recovery of on-premises VMs to Microsoft Azure. With support for both Disaster Recovery and Migration to Azure, the Azure Site Recovery service now provides a simple, reliable, and cost-effective DR solution for enabling Virtual Machine replication and recovery between on-premises private clouds across different enterprise locations, or directly to the cloud with Azure.

This month’s release builds upon our recent InMage acquisition, and the integration of InMage Scout with Azure Site Recovery enables us to provide hybrid cloud business continuity solutions for any customer IT environment – regardless of whether it is Windows or Linux, running on physical servers or virtualized servers using Hyper-V, VMware or other virtualization solutions. Microsoft Azure is now the ideal destination for disaster recovery for virtually every enterprise server in the world.

In addition to enabling replication to and disaster recovery in Azure, the Azure Site Recovery service also enables the automated protection of VMs, remote health monitoring of them, no-impact disaster recovery plan testing, and single click orchestrated recovery - all backed by an enterprise-grade SLA. A new addition with this GA release is the ability to also invoke Azure Automation runbooks from within Azure Site Recovery Plans, enabling you to further automate your solutions.

image

Learn More about Azure Site Recovery

For more information on Azure Site Recovery, check out the recording of the Azure Site Recovery session at TechEd 2014 where we discussed the preview.  You can also visit the Azure Site Recovery forum on MSDN for additional information and to engage with the engineering team or other customers.

Once you’re ready to get started with Azure Site Recovery, check out additional pricing or product information, and sign up for a free Azure trial.

Beginning this month, Azure Backup and Azure Site Recovery will also be available as a convenient and economical promotional offer that can be purchased via a Microsoft Enterprise Agreement.  Each unit of the Azure Backup & Site Recovery annual subscription offer covers protection of a single instance to Azure with Site Recovery, as well as backup of data with Azure Backup.  You can contact your Microsoft Reseller or Microsoft representative for more information.

Management: Tag Support with Resources

I’m excited to announce the support of tags in the Azure management platform and in the Azure preview portal.

Tags provide an easy way to organize your Azure resources and resource groups, by allowing you to tag your resources with name/value pairs that further categorize them, and to view resources across resource groups and across subscriptions.  For example, you could use tags to identify which of your resources are used for “production” versus “dev/test” – and enable easy filtering/searching of the resources based on the tag you were interested in – regardless of which application or resource group they are in.

Using Tags

To get started with the new Tag support, browse to any resource or resource group in the Azure Preview Portal and click on the Tags tile on the resource.

image

On the Tags blade that appears, you'll see a list of any tags you've already applied. To add a new tag, simply specify a name and value and press enter. After you've added a few tags, you'll notice autocomplete options based on pre-existing tag names and values to better ensure a consistent taxonomy across your resources and to avoid common mistakes, like misspellings.

image

You can also use our command-line tools to tag resources.  Below is an example of using the Azure PowerShell module to quickly tag all of the resources in your Azure subscription:

image
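
The pattern is roughly as follows – a hedged sketch using the preview Azure PowerShell cmdlets in Resource Manager mode, where the cmdlet/parameter shapes and tag values should be treated as illustrative:

# Sketch: tag every resource group in the subscription with an "environment" tag.
Switch-AzureMode AzureResourceManager

Get-AzureResourceGroup | ForEach-Object {
    Set-AzureResourceGroup -Name $_.ResourceGroupName -Tag @{ Name = "environment"; Value = "production" }
}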

Once you've tagged your resources and resource groups, you can view the full list of tags across all of your subscriptions using the Browse hub.

image

You can also “pin” tags to your Startboard for quick access.  This provides a really easy way to quickly jump to any resource in a tag you’ve pinned:

image 

SQL Databases: Public Preview of Elastic Scale Support

I am excited to announce the public preview of Elastic Scale for Azure SQL Database. Elastic Scale enables the data-tier of an application to scale out via industry-standard sharding practices, while significantly streamlining the development and management of your sharded cloud applications. The new capabilities are provided through .NET libraries and Azure service templates that are hosted in your own Azure subscription to manage your highly scalable applications. Elastic Scale implements the infrastructure aspects of sharding and thus allows you to instead focus on the business logic of your application.

Elastic Scale allows developers to establish a “contract” that defines where different slices of data reside across a collection of database instances.  This enables applications to easily and automatically direct transactions to the appropriate database (shard) and perform queries that cross many or all shards using simple extensions to the ADO.NET programming model. Elastic Scale also enables coordinated data movement between shards to split or merge ranges of data among different databases and satisfy common scenarios such as pulling a busy tenant into its own shard. 

image

We are also announcing the Federation Migration Utility which is available as part of the preview. This utility will help current SQL Database Federations customers migrate their Federations application to Elastic Scale without having to perform any data movement.

Get Started with the Elastic Scale preview today, and watch our Channel 9 video to learn more.

DocumentDB: Document Explorer, Collection management and new metrics

Last week we released a bunch of updates to the Azure DocumentDB service experience in the Azure Preview Portal. We continue to improve the developer and management experiences so you can be more productive and build great applications on DocumentDB. These improvements include:

  • Document Explorer: View and access JSON documents in your database account
  • Collection management: Easily add and delete collections
  • Database performance metrics and storage information: View performance metrics and storage consumed at a Database level
  • Collection performance metrics and storage information: View performance metrics and storage consumed at a Collection level
  • Support for Azure tags: Apply custom tags to DocumentDB Accounts

Document Explorer

Near the bottom of the DocumentDB Account, Database, and Collection blades, you’ll now find a new Developer Tools lens with a Document Explorer part.

image

This part provides you with a read-only document explorer experience. Select a database and collection within the Document Explorer and view documents within that collection.

image

Note that the Document Explorer will load up to the first 100 documents in the selected Collection. You can load additional documents (in batches of 100) by selecting the “Load more” option at the bottom of the Document Explorer blade. Future updates will expand Document Explorer functionality to enable document CRUD operations as well as the ability to filter documents.

Collection Management

The DocumentDB Database blade now allows you to quickly create a new Collection through the Add Collection command found on the top left of the Database blade.

image

Health Metrics

We’ve added a new Collection blade which exposes Collection level performance metrics and storage information. You can access this new blade by selecting a Collection from the list of Collections on the Database blade.

image

The Database and Collection level metrics are available via the Database and Collection blades.

image

image

As always, we’d love to hear from you about the DocumentDB features and experiences you would find most valuable within the Azure portal. You can submit your suggestions on the Microsoft Azure DocumentDB feedback forum.

Notification Hubs: support for Baidu Cloud Push

Azure Notification Hubs enable cross-platform mobile push notifications for Android, iOS, Windows, Windows Phone, and Kindle devices. Thousands of customers now use Notification Hubs for instant cross-platform broadcast, personalized notifications to dynamic segments of their mobile audience, or simply to reach individual customers of their mobile apps regardless of which device they use.  Today I am excited to announce support for another mobile notifications platform, Baidu Cloud Push, which will help Notification Hubs customers reach the diverse family of Android devices in China.

Delivering push notifications to Android devices in China is no easy task, due to a diverse set of app stores and push services. Pushing notifications to an Android device via Google Cloud Messaging Service (GCM) does not work, as most Android devices in China are not configured to use GCM.  To help app developers reach every Android device independent of which app store they’re configured with, Azure Notification Hubs now supports sending push notifications via the Baidu Cloud Push service.

To use Baidu from your Notification Hub, register your app with Baidu, and obtain the appropriate identifiers (UserId and ChannelId) for your application.

image

Then configure your Notification Hub within the Azure Management Portal with these identifiers:

image

For more details, follow the tutorial in English & Chinese. You can learn more about Push Notifications using Azure at the Notification Hubs dev center.

Virtual Machines: Instance-Level Public IPs generally available

Azure now supports the ability for you to assign public IP addresses to VMs and web or worker roles so they become directly addressable on the Internet - without having to map a virtual IP endpoint for access. With Instance-Level Public IPs, you can enable scenarios like running FTP servers in Azure and monitoring VMs directly using their IPs.
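
As a quick sketch, an instance-level public IP can be assigned to an existing VM with the Azure PowerShell module (the service, VM and IP names below are placeholders):

# Sketch: assign an instance-level public IP to an existing VM.
Get-AzureVM -ServiceName "FTPService" -Name "FTPInstance" |
    Set-AzurePublicIP -PublicIPName "ftpip" |
    Update-AzureVM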

For more information, please visit the Instance-Level Public IP Addresses webpage.

Automation: Updates

Earlier this year, we introduced preview availability of Azure Automation, a service that allows you to automate the deployment, monitoring, and maintenance of your Azure resources. I am excited to announce several new features in Azure Automation:

  • Active Directory Authentication
  • PowerShell Script Converter
  • Runbook Gallery
  • Hourly Scheduling

Active Directory Authentication

We now offer an easier alternative to using certificates to authenticate from the Azure Automation service to your Azure environment. You can now authenticate to Azure using an Azure Active Directory organization identity which provides simple, credential-based authentication.

If you do not have an Active Directory user set up already, simply create a new user and give that user access to manage your Azure subscription. Once you have done this, create an Automation credential asset with the user’s credentials and reference that credential in your runbooks. You need to do this setup only once and can then use the stored credentials going forward, greatly reducing the number of steps you need to take to start automating. You can read this blog to learn more about getting set up with Active Directory Authentication.
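
Inside a runbook, the pattern looks roughly like this sketch (the credential asset and subscription names are placeholders):

workflow Connect-AzureSample
{
    # Retrieve the stored credential asset – no certificates needed.
    $cred = Get-AutomationPSCredential -Name "AzureADAutomationUser"

    # Authenticate with the organizational identity and select a subscription.
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName "My Subscription"

    # The runbook can now manage resources in that subscription, for example:
    Get-AzureVM | Select-Object Name, InstanceStatus
}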

PowerShell Script Converter

Azure Automation now supports importing PowerShell scripts as runbooks. When you import a PowerShell script that does not contain a single PowerShell Workflow, Automation will attempt to convert it from PowerShell script to PowerShell Workflow, and then create a runbook from the result. This allows the vast amount of PowerShell content and knowledge that exists today to be more easily leveraged in Azure Automation, even though Automation executes PowerShell Workflow rather than plain PowerShell.

Runbook Gallery

The Runbook Gallery allows you to quickly discover Automation sample, utility, and scenario runbooks from within the Azure management portal. The Runbook Gallery consists of runbooks that can be used as is or with minor modification, and runbooks that can serve as examples of how to create your own runbooks. The Runbook Gallery features content not only by Microsoft, but also by active members of the Azure community. If you have created a runbook that you think other users may benefit from, you can share it with the community on Script Center and it will show up in the Gallery. If you are interested in learning more about the Runbook Gallery, this TechNet article describes how the Gallery works in more detail and provides information on how you can contribute.

You can access the Gallery from +New by selecting App Services > Automation > Runbook > From Gallery.

image

In the Gallery wizard, you can browse for runbooks by selecting the category in the left hand pane and then view the description of the selected runbook in the right pane. You can then preview the code and finally import the runbook into your personal space:

image

We will be adding the ability to expand the Gallery to include PowerShell scripts in the near future. These scripts will be converted to Workflows when they are imported to your Automation Account using the new PowerShell Script Converter. This means that you will have more content to choose from and a tool to help you get your PowerShell scripts running in Azure.

Hourly Scheduling

Based on popular request from our users, hourly scheduling is now available in Azure Automation. This feature allows you to schedule your runbook hourly or every X hours, making it that much easier to start runbooks at a regular frequency that is smaller than a day.

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

New D-Series of Azure VMs with 60% Faster CPUs, More Memory and Local SSD Disks

Today I’m excited to announce that we just released a new set of VM sizes for Microsoft Azure. These VM sizes are now available to be used immediately by every Azure customer.

The new D-Series of VMs can be used with both Azure Virtual Machines and Azure Cloud Services.  In addition to offering faster vCPUs (approximately 60% faster than our A series) and more memory (up to 112 GB), the new VM sizes also all have a local SSD disk (up to 800 GB) to enable much faster IO reads and writes.

The new VM sizes available today include the following:

General Purpose D-Series VMs

Name           vCores   Memory (GB)   Local SSD Disk (GB)
Standard_D1         1           3.5                    50
Standard_D2         2             7                   100
Standard_D3         4            14                   200
Standard_D4         8            28                   400

 

High Memory D-Series VMs

Name           vCores   Memory (GB)   Local SSD Disk (GB)
Standard_D11        2            14                   100
Standard_D12        4            28                   200
Standard_D13        8            56                   400
Standard_D14       16           112                   800

For pricing information, please see Virtual Machine Pricing Details.

Local SSD Disk and SQL Server Buffer Pool Extensions

A temporary drive on the VMs (D:\ on Windows, /mnt or /mnt/resource on Linux) is mapped to the local SSDs exposed on the D-Series VMs, and provides a really good option for replicated storage workloads, like MongoDB, or for significantly increasing the performance of SQL Server 2014 by enabling its unique Buffer Pool Extensions (BPE) feature.

SQL Server 2014’s Buffer Pool Extensions allows you to extend the SQL Engine Buffer Pool onto local SSD storage to significantly improve the performance of SQL workloads. The Buffer Pool is a global memory resource used to cache data pages for much faster read operations.  Without any code changes in your application, you can enable the buffer pool support with the SSDs of the D-Series VMs using a simple T-SQL statement of just four lines:

ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = 'D:\SSDCACHE\EXAMPLE.BPE',
     SIZE = <size> [ KB | MB | GB ]);

No code changes are required in your application, and all write operations will continue to be durably persisted to VM drives backed by Azure Storage. More details on configuring and using BPE can be found here.

Start Using the D-Series VMs Today

You can start using the new D-Series VM sizes immediately.  They can be created and used via both the current Azure Management Portal and the Preview Portal, as well as from the Azure management command-line tools, scripts and APIs.
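
For example, creating a D-Series VM from PowerShell only requires picking one of the new sizes – a minimal sketch with the service management module, where the image, credentials, and service name are placeholders:

# Sketch: create a Standard_D3 VM with the Azure PowerShell module.
New-AzureVMConfig -Name "d-series-vm" -InstanceSize "Standard_D3" -ImageName $imageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername $adminUser -Password $adminPassword |
    New-AzureVM -ServiceName "my-dseries-svc" -Location "West US"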

To learn more about the D-Series please read this post which has even more details about them, as well as check out the Azure documentation center.

Hope this helps,

Scott


Azure: SQL Databases, API Management, Media Services, Websites, Role Based Access Control and More

This week we released a major set of updates to Microsoft Azure. This week’s updates include:

  • SQL Databases: General Availability of Azure SQL Database Service Tiers
  • API Management: General Availability of our API Management Service
  • Media Services: Live Streaming, Content Protection, Faster and Cost Effective Encoding, and Media Indexer
  • Web Sites: Virtual Network integration, new scalable CMS with WordPress and updates to Web Site Backup in the Preview Portal
  • Role-based Access Control: Preview release of role-based access control for Azure Management operations
  • Alerting: General Availability of Azure Alerting and new alerts on events

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:  

SQL Databases: General Availability of Azure SQL Database Service Tiers

I’m happy to announce the General Availability of our new Azure SQL Database service tiers - Basic, Standard, and Premium.  The SQL Database service within Azure provides a compelling database-as-a-service offering that enables you to innovate quickly, and to stand up and run SQL databases without having to manage or operate VMs or infrastructure.

Today’s SQL Database Service Tiers all come with a 99.99% SLA, and databases can now grow up to 500GB in size.

Each SQL Database tier now guarantees a consistent performance level that you can depend on within your applications – avoiding the need to worry about “noisy neighbors” who might impact your performance from time to time.

Built-in point-in-time restore support now provides you with the ability to re-create databases as they existed at a specific point in time (giving you much more backup flexibility, and allowing you to restore to exactly the point before you accidentally did something bad to your data).

Built-in auditing support enables you to gain insight into events and changes that occur with the databases you host.

Built-in active geo-replication support, available with the premium tier, enables you to create up to 4 readable secondary databases in any Azure region.  When active geo-replication is enabled, we will ensure that all transactions committed to the database in your primary region are continuously replicated to the databases in the other regions as well:

image

One of the primary benefits of active geo-replication is that it provides application control over disaster recovery at a database level.  Having cross-region redundancy enables your applications to recover in the event of a disaster (e.g. a natural disaster).  The new active geo-replication support enables you to initiate/control any failovers – allowing you to shift the primary database to any of your secondary regions:

image

This provides a robust business continuity offering, and enables you to run mission critical solutions in the cloud with confidence. 

More Flexible Pricing

SQL Databases are now billed on a per-hour basis – allowing you to quickly create and tear down databases, and dynamically scale up or down databases even more cost effectively.

Basic Tier databases support up to 2GB of data and cost $4.99 for a full month of use.  Standard Tier databases support up to 250GB of data and now start at $15/month (there are also higher-performance standard tiers at $30/month and $75/month). Premium Tier databases support up to 500GB of data as well as the active geo-replication feature, and now start at $465/month.

The below table provides a quick look at the different tiers and functionality:

image

This page provides more details on how to think about DTU performance with each of the above tiers, and provides benchmark details on the number of transactions supported by each of the above service tiers and performance levels.

During the preview, we’ve heard from some ISVs who have a large number of databases with variable performance demands that they need the flexibility to share DTU performance resources across multiple databases, as opposed to managing tiers for each database individually.  For example, some SaaS ISVs may have a separate SQL database for each customer, and as the activity of each database varies, they want to manage a pool of resources with a defined budget across these customer databases.  We are working to enable this scenario within the new service tiers in a future service update. If you are an ISV with a similar scenario, please click here to sign up to learn more.

Learn more about SQL Databases on Azure here.

API Management Service: General Availability Release

I’m excited to announce the General Availability of the Azure API Management Service.

In my last post I discussed how API Management enables customers to securely publish APIs to developers and accelerate partner adoption.  These APIs can be used from mobile and client applications (on any device) as well as other cloud and service based applications.

The API management service supports the ability to take any APIs you already have (either in the cloud or on-premises) and publish them for others to use.  The API Management service enables you to:

  • Throttle, rate limit and quota your APIs
  • Gain analytic insights on how your APIs are being used and by whom
  • Secure your APIs using OAuth or key-based access
  • Track the health of your APIs and quickly identify errors
  • Easily expose a developer portal for your APIs that provides documentation and test experiences to developers who want to use your APIs

Today’s General Availability provides a formal SLA for Standard tier services.  We also have a developer tier of the service that you can use, starting at just $49 per month.

OAuth support in the Developer Portal

The API Management service provides a developer console that enables a great on-boarding and interactive learning experience for developers who want to use your APIs.  The developer console enables you to easily expose documentation, as well as enable developers to try/test your APIs.

With this week’s GA release we are also adding support that enables API publishers to register their OAuth Authorization Servers for use in the console, which in turn allows developers to sign in with their own login credentials when interacting with your API - a critical feature for any API that supports OAuth. All of the authorization grant types defined by the OAuth 2.0 specification are supported, along with scopes and default scopes.

image

For more details on how to enable OAuth 2 support with API Management and integration in the new developer portal, check out this tutorial.

Click here to learn more about the API Management service and try it out for free.

Media Services: Live Streaming, DRM, Faster Cost Effective Encoding, and Media Indexer

This week we are excited to announce the public preview of Live Streaming and Content Protection support with Azure Media Services.

The same Internet scale streaming solution that leading international broadcasters used to live stream the 2014 Winter Olympic Games and 2014 FIFA World Cup to tens of millions of customers globally is now available in public preview to all Azure customers. This means you can now stream live events of any size with the same level of scalability, uptime, and reliability that was available to the Olympics and World Cup.

DRM Content Protection

This week Azure Media Services is also introducing a new Content Protection offering which features both static and dynamic encryption with first-party PlayReady license delivery and an AES 128-bit key delivery service.  This makes it easy to DRM-protect both your live and pre-recorded video assets – and make them available for users to easily watch on any device or platform (Windows, Mac, iOS, Android and more).

Faster and More Cost Effective Media Encoding

This week, we are also introducing faster media encoding speeds and more cost-effective billing. Our enhanced Azure Media Encoder is designed for premium media encoding and is billed based on output GBs. Our previous encoder was billed on both input + output GBs, so the shift to output only billing will result in a substantial price reduction for all of our customers.

To help you further optimize your encoding workflows, we’re introducing Basic, Standard, and Premium Encoding Reserved units, which give you more flexibility and allow you to tailor the encoding capability you pay for to the needs of your specific workflows.

Media Indexer

Additionally, I’m happy to announce the General Availability of Azure Media Indexer, a powerful, market-differentiated content extraction service which can be used to enhance the searchability of audio and video files.  With Media Indexer you can automatically analyze your media files and index the audio and video content in them. You can learn more about it here.

More Media Partners

I’m also pleased to announce the addition this week of several media workflow partners and client players to our existing large set of media partners:

  • Azure Media Services and Telestream’s Wirecast are now fully integrated, including a built-in destination that makes it quick and easy to send content from Wirecast’s live streaming production software to Azure.
  • Similarly, Newtek’s Tricaster has also been integrated into the Azure platform, enabling customers to combine the high production value of Tricaster with the scalability and reliability of Azure Media Services.
  • Cires21 and Azure Media have paired up to help make monitoring the health of your live channels simple and easy, and the widely-used JW player is now fully integrated with Azure to enable you to quickly build video playback experiences across virtually all platforms.

Learn More

Visit the Azure Media Services site for more information and to get started for free.

Websites: Virtual Network Integration, new Scalable CMS with WordPress

This week we’ve also released a number of great updates to our Azure Websites service.

Virtual Network Integration

Starting this week you can now integrate your Azure Websites with Azure Virtual Networks. This support enables your Websites to access resources attached to your virtual networks.  For example: this means you can now have a Website directly connect to a database hosted in a non-public VM on a virtual network.  If your Virtual Network is connected to your on-premises network (using a Site-to-Site software VPN or ExpressRoute dedicated fiber VPN) you can also now have your Website connect to resources in your on-premises network as well.

The new Virtual Network support enables both TCP and UDP protocols and will work with your VNET DNS. Hybrid Connections and Virtual Network integration are compatible, so you can mix both in the same Website.  The new virtual network support for Web Sites is being released this week in preview.  Standard web hosting plans can have up to 5 virtual networks enabled. A website can only be connected to one virtual network at a time, but there is no restriction on the number of websites that can be connected to a virtual network.

You can configure a Website to use a Virtual Network using the new Preview Azure Portal (http://portal.azure.com).  Click the “Virtual Network” tile in your Website to bring up a virtual network blade that you can use to either create a new virtual network or attach to an existing one you already have:

image

Note that an Azure Website requires your Virtual Network to have a configured gateway with Point-to-Site enabled. The option will remain grayed out in the UI above until you have enabled this.

Scalable CMS with WordPress

This week we also released support for a Scalable CMS solution with WordPress running on Azure Websites.  Scalable CMS with WordPress provides the fastest way to build an optimized and hassle-free WordPress Website. It is architected so that your WordPress site loads fast and can support millions of page views a month, and you can easily scale up or scale out as your traffic increases.

It is pre-configured to use Azure Storage, which can be used to store your site’s media library content, and can be easily configured to use the Azure CDN.  Every Scalable CMS site comes with auto-scale, staged publishing, SSL, custom domains, Webjobs, and backup and restore features of Azure Websites enabled. Scalable WordPress also allows you to use Jetpack to supercharge your WordPress site with powerful features available to WordPress.com users.

You can now easily deploy Scalable CMS with WordPress solutions on Azure via the Azure Gallery integrated within the new Azure Preview Portal (http://portal.azure.com).  When you select it within the portal it will walk you through automatically setting up and deploying a complete solution on Azure:

image

Scalable WordPress is ideal for Web developers, creative agencies, businesses and enterprises wanting a turn-key solution that maximizes performance of running WordPress on Azure Websites.  It’s fast, simple and secure WordPress hosting on Azure Websites.

Updates to Website Backup

This week we also updated our built-in Backup feature within Azure Websites with a number of nice enhancements.  Starting today, you can now:

  • Choose the exact destination of your backups, including the specific Storage account and blob container you wish to store your backups within.
  • Choose to backup SQL databases or MySQL databases that are declared in the connection strings of the website.
  • On the restore side, you can now restore both to a new site and to a deployment slot on a site. This makes it possible to verify your backup before you make it live.

These new capabilities make it easier than ever to have a full history of your website and its associated data.

Security: Role Based Access Control for Management of Azure

As organizations move more and more of their workloads to Azure, one of the most requested features has been the ability to control which cloud resources different employees can access and what actions they can perform on those resources.

Today, I’m excited to announce the preview release of Role Based Access Control (RBAC) support in the Azure platform. RBAC is now available in the Azure preview portal and can be used to control access in the portal or access to the Azure Resource Manager APIs. You can use this support to limit the access of users and groups by assigning them roles on Azure resources. Highlights include:

  • A subscription is no longer the access management boundary in Azure. In April, we introduced Resource Groups, a container to group resources that share a lifecycle. Now, you can grant users access on a resource group, as well as on individual resources like specific Websites or VMs.
  • You can now grant access to both users and groups. RBAC is based on Azure Active Directory, so if your organization already uses groups in Azure Active Directory or Windows Server Active Directory for access management, you will be able to manage access to Azure the same way.

Below are some more details on how this works and can be enabled.

Azure Active Directory

Azure Active Directory is our directory service in the cloud.  You can create organizational tenants within Azure Active Directory and define users and groups within it – without having to have any existing Active Directory setup on-premises.

Alternatively, you can also sync (or federate) users and groups from your existing on-premises Active Directory to Azure Active Directory, and have your existing users and groups automatically be available for use in the cloud with Azure, Office 365, as well as over 2000 other SaaS-based applications:

image

All users that access your Azure subscription are now present in the Azure Active Directory with which the subscription is associated. This enables you to manage what they can do, as well as revoke their access to all Azure subscriptions by disabling their account in the directory.

Role Permissions

In this first preview we are pre-defining three built-in Azure roles that give you a choice of granting restricted access:

  • An Owner can perform all management operations on a resource and its child resources, including access management.
  • A Contributor can perform all management operations on a resource, including creating and deleting resources. A contributor cannot grant access to others.
  • A Reader has read-only access to a resource and its child resources. A Reader cannot read secrets.

In the RBAC model, users who have been configured to be the service administrator and co-administrators of an Azure subscription are mapped as belonging to the Owners role of the subscription. Their access to both the current and preview management portals remains unchanged.

Additional users and groups that you then assign to the new RBAC roles will only have those permissions, and also will only be able to manage Azure resources using the new Azure preview portal and Azure Resource Manager APIs.  RBAC is not supported in the current Azure management portal or via older management APIs (since neither of these were built with the concept of role based security built-in).

Restricting Access based on Role Based Permissions

Let’s assume that your team is using Azure for development, as well as to host the production instance of your application. When doing this you might want to separate the resources employed in development and testing from the production resources using Resource Groups.

You might want to allow everyone in your team to have a read-only view of all resources in your Azure subscription – including the ability to read and review production analytics data. You might then want to only allow certain users to have write/contributor access to the production resources.  Let’s look at how to set this up:

Step 1: Setting up Roles at the Subscription Level

We’ll begin by mapping some users to roles at the subscription level.  These will then by default be inherited by all resources and resource groups within our Azure subscription.

To set this up, open the Billing blade within the Preview Azure Portal (http://portal.azure.com), and within the Billing blade select the Azure subscription that you wish to setup roles for: 

image

Then scroll down within the blade of the subscription you opened, and locate the Roles tile within it:

image

Clicking the Roles tile will bring up a blade that lists the pre-defined roles we provide by default (Owner, Contributor, Reader).  You can click any of the roles to bring up a list of the users assigned to the role.  Clicking the Add button will then allow you to search your Azure Active Directory and add either a user or group to that role. 

Below I’ve opened up the default Reader role and added David and Fred to it:

image

Once we do this, David and Fred will be able to log into the Preview Azure Portal and will have read-only access to the resources contained within our subscription.  They will not be able to make any changes, though, nor be able to see secrets (passwords, etc.).

Note that in addition to adding users and groups from within your directory, you can also use the Invite button above to invite users who are not currently part of your directory, but who have a Microsoft Account (e.g. scott@outlook.com), to also be mapped into a role.

Step 2: Setting up Roles at the Resource Level

Once you’ve defined the default role mappings at the subscription level, they will by default apply to all resources and resource groups contained within it. 

If you wish to scope permissions even further at just an individual resource (e.g. a VM or Website or Database) or at a resource group level (e.g. an entire application and all resources within it), you can also open up the individual resource/resource-group blade and use the Roles tile within it to further specify permissions.

For example, earlier we granted David reader role access to all resources within our Azure subscription.  Let’s now grant him contributor role access to just an individual VM within the subscription.  Once we do this he’ll be able to stop/start the VM as well as make changes to it.

To enable this, I’ve opened up the blade for the VM below.  I’ve then scrolled down the blade and found the Roles tile within the VM.  Clicking the contributor role within the Roles tile will then bring up a blade that allows me to configure which users will be contributors (meaning have read and modify permissions) for this particular VM.  Notice below how I’ve added David to this:

image

Using this resource/resource-group level approach enables you to have really fine-grained access control permissions on your resources.

Command Line and API Access for Azure Role Based Access Control

The enforcement of the access policies that you configure using RBAC is done by the Azure Resource Manager APIs.  Both the Azure preview portal as well as the command line tools we ship use the Resource Manager APIs to execute management operations. This ensures that access is consistently enforced regardless of what tools are used to manage Azure resources.

With this week’s release we’ve included a number of new PowerShell cmdlets that enable you to automate setting up, as well as controlling, role-based access.
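
For example – a hedged sketch using the preview cmdlets, where the user, role, and resource group values are placeholders and the exact parameter shapes should be treated as illustrative:

# Sketch: inspect the built-in roles and grant a user Reader access on a resource group.
Switch-AzureMode AzureResourceManager

Get-AzureRoleDefinition

New-AzureRoleAssignment -Mail "david@contoso.com" -RoleDefinitionName "Reader" `
    -ResourceGroupName "production-group"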

Learn More about Role Based Access

Today’s Role Based Access Control Preview provides a lot more flexibility in how you manage the security of your Azure resources.  It is easy to setup and configure.  And because it integrates with Azure Active Directory, you can easily sync/federate it to also integrate with the existing Active Directory configuration you might already have in your on-premises environment.

Getting started with the new Azure Role Based Access Control support is as simple as assigning the appropriate users and groups to roles on your Azure subscription or individual resources. You can read more detailed information on the concepts and capabilities of RBAC here. Your feedback on the preview features is critical for all improvements and new capabilities coming in this space, so please try out the new features and provide us your feedback.

Alerts: General Availability of Azure Alerting and new Alerts on Events support

I’m excited to announce the release of Azure Alerting to General Availability. Azure Alerting supports creating alert thresholds on metrics that you are interested in; Azure then automatically sends an email notification when a threshold is crossed. As part of the general availability release, we are removing the 10-alert-rule cap per subscription.

Alerts are available in the full Azure portal by clicking Management Services in the left navigation bar:

image

Also, alerting is available on most of the resources in the Azure preview portal:

image

You can create alerts on metrics from 8 different services today (and we are adding more all the time):

  • Cloud Services
  • Virtual Machines
  • Websites
  • Web hosting plans
  • Storage accounts
  • SQL databases
  • Redis Cache
  • DocumentDB accounts

In addition to general availability for alerts on metrics, we are also previewing the ability to create alerts on operational events. This means you can get an email if someone stops your website, if your virtual machines are deleted, or if your Azure Resource Manager template deployment fails. Like alerts on metrics, you can route these alerts to the service administrator and co-administrators, or to a custom email address you provide.  You can configure these events on a resource in the Azure Preview Portal.  We have enabled this within the Portal for Websites – we’ll be extending it to all resources in the future.

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Azure: New DocumentDB NoSQL Service, New Search Service, New SQL AlwaysOn VM Template, and more

Today we released a major set of updates to Microsoft Azure. Today’s updates include:

  • DocumentDB: Preview of a New NoSQL Document Service for Azure
  • Search: Preview of a New Search-as-a-Service offering for Azure
  • Virtual Machines: Portal support for SQL Server AlwaysOn + community-driven VMs
  • Web Sites: Support for Web Jobs and Web Site processes in the Preview Portal
  • Azure Insights: General Availability of Microsoft Azure Monitoring Services Management Library
  • API Management: Support for API Management REST APIs

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

DocumentDB: Announcing a New NoSQL Document Service for Azure

I’m excited to announce the preview of our new DocumentDB service - a NoSQL document database service designed for scalable and high performance modern applications.  DocumentDB is delivered as a fully managed service (meaning you don’t have to manage any infrastructure or VMs yourself) with an enterprise grade SLA.

As a NoSQL store, DocumentDB is truly schema-free. It allows you to store and query any JSON document, regardless of schema. The service provides built-in automatic indexing support – which means you can write JSON documents to the store and immediately query them using a familiar document-oriented SQL query grammar. You can optionally extend the query grammar to perform server-side evaluation of user-defined functions (UDFs) written in JavaScript.

DocumentDB is designed to linearly scale to meet the needs of your application. The DocumentDB service is purchased in capacity units, each offering a reservation of high performance storage and dedicated performance throughput. Capacity units can be easily added or removed via the Azure portal or REST based management API based on your scale needs. This allows you to elastically scale databases in fine grained increments with predictable performance and no application downtime simply by increasing or decreasing capacity units.

Over the last year, we have used DocumentDB internally within Microsoft for several high-profile services.  We now have DocumentDB databases that are each 100s of TBs in size, each processing millions of complex DocumentDB queries per day, with predictable low single-digit millisecond latency.  DocumentDB provides a great way to scale applications and solutions like this to an incredible size.

DocumentDB also enables you to tune performance further by customizing the index policies and consistency levels you want for a particular application or scenario, making it an incredibly flexible and powerful data service for your applications.   For queries and read operations, DocumentDB offers four distinct consistency levels - Strong, Bounded Staleness, Session, and Eventual. These consistency levels allow you to make sound tradeoffs between consistency and performance. Each consistency level is backed by a predictable performance level ensuring you can achieve reliable results for your application.
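As a minimal sketch of how you opt into a consistency level (the endpoint and key below are placeholders for the values you retrieve from the portal), the preview .NET SDK lets you pick the level when you construct the client:

using System;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

var client = new DocumentClient(
    new Uri("https://<your-account>.documents.azure.com"),
    "<your-auth-key>",
    null,                          // default connection policy
    ConsistencyLevel.Session);     // or Strong, BoundedStaleness, Eventual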

DocumentDB has made a significant bet on ubiquitous formats like JSON, HTTP and REST – which makes it easy to start taking advantage of from any web or mobile application.  With today’s release we are also distributing .NET, Node.js, JavaScript and Python SDKs.  The service can also be accessed through RESTful HTTP interfaces and is simple to manage through the Azure preview portal.

Provisioning a DocumentDB account

To get started with DocumentDB you provision a new database account. To do this, use the new Azure Preview Portal (http://portal.azure.com), click the Azure gallery and select the Data, storage, cache + backup category, and locate the DocumentDB gallery item.

image

Once you select the DocumentDB item, choose the Create command to bring up the Create blade for it.

In the create blade, specify the name of the service you wish to create, the amount of capacity you wish to scale your DocumentDB instance to, and the location around the world where you want to deploy it (e.g. the West US Azure region):

image

Once provisioning is complete, you can start to manage your DocumentDB account by clicking the new instance icon on your Azure portal dashboard. 

image

The Keys tile can be used to retrieve the security keys you’ll use to access the DocumentDB service programmatically.

Developing with DocumentDB

DocumentDB provides a number of different ways to program against it. You can use the REST API directly over HTTPS, or you can choose from either the .NET, Node.js, JavaScript or Python client SDKs.

The JSON data I am going to use for this example consists of two families:

// AndersonFamily.json file
{
    "id": "AndersenFamily",
    "lastName": "Andersen",
    "parents": [
        { "firstName": "Thomas" },
        { "firstName": "Mary Kay" }
    ],
    "children": [
        { "firstName": "John", "gender": "male", "grade": 7 }
    ],
    "pets": [
        { "givenName": "Fluffy" }
    ],
    "address": { "country": "USA", "state": "WA", "city": "Seattle" }
}

and

// WakefieldFamily.json file
{
    "id": "WakefieldFamily",
    "parents": [
        { "familyName": "Wakefield", "givenName": "Robin" },
        { "familyName": "Miller", "givenName": "Ben" }
    ],
    "children": [
        {
            "familyName": "Wakefield",
            "givenName": "Jesse",
            "gender": "female",
            "grade": 1
        },
        {
            "familyName": "Miller",
            "givenName": "Lisa",
            "gender": "female",
            "grade": 8
        }
    ],
    "pets": [
        { "givenName": "Goofy" },
        { "givenName": "Shadow" }
    ],
    "address": { "country": "USA", "state": "NY", "county": "Manhattan", "city": "NY" }
}

Using the NuGet package manager in Visual Studio, I can search for and install the DocumentDB .NET package into any .NET application. With the URI and authentication keys for the DocumentDB service that I retrieved earlier from the Azure Management portal, I can then connect to the DocumentDB service I just provisioned, create a database, create a collection, insert some JSON documents, and immediately start querying them:

using (var client = new DocumentClient(new Uri(endpoint), authKey))
{
    var database = new Database { Id = "ScottsDemoDB" };
    database = await client.CreateDatabaseAsync(database);

    var collection = new DocumentCollection { Id = "Families" };
    collection = await client.CreateDocumentCollectionAsync(database.SelfLink, collection);

    // DocumentDB supports strongly typed POCO objects and also dynamic objects
    dynamic andersonFamily = JsonConvert.DeserializeObject(File.ReadAllText(@".\Data\AndersonFamily.json"));
    dynamic wakefieldFamily = JsonConvert.DeserializeObject(File.ReadAllText(@".\Data\WakefieldFamily.json"));

    // persist the documents in DocumentDB
    await client.CreateDocumentAsync(collection.SelfLink, andersonFamily);
    await client.CreateDocumentAsync(collection.SelfLink, wakefieldFamily);

    // very simple query returning the full JSON document matching a simple WHERE clause
    var query = client.CreateDocumentQuery(collection.SelfLink, "SELECT * FROM Families f WHERE f.id = 'AndersenFamily'");
    var family = query.AsEnumerable().FirstOrDefault();

    Console.WriteLine("The Anderson family have the following pets:");
    foreach (var pet in family.pets)
    {
        Console.WriteLine(pet.givenName);
    }

    // select JUST the child record out of the Family record where the child's gender is male
    query = client.CreateDocumentQuery(collection.DocumentsLink, "SELECT * FROM c IN Families.children WHERE c.gender='male'");
    var child = query.AsEnumerable().FirstOrDefault();

    Console.WriteLine("The Andersons have a son named {0} in grade {1}", child.firstName, child.grade);

    // cleanup test database
    await client.DeleteDatabaseAsync(database.SelfLink);
}

As you can see above – the .NET API for DocumentDB fully supports the .NET async pattern, which makes it ideal for use with applications you want to scale well. 
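The SDK also includes a LINQ provider, so the same query can be expressed against a strongly typed POCO instead of dynamic objects.  As a minimal sketch (continuing from the snippet above – the Family class here is a hypothetical mapping of just the JSON properties used):

using System.Linq;
using Microsoft.Azure.Documents.Client;
using Newtonsoft.Json;

public class Family
{
    [JsonProperty("id")]
    public string Id { get; set; }

    [JsonProperty("lastName")]
    public string LastName { get; set; }
}

// CreateDocumentQuery<T> exposes an IQueryable that the SDK translates into DocumentDB SQL
var andersens = client.CreateDocumentQuery<Family>(collection.SelfLink)
                      .Where(f => f.Id == "AndersenFamily")
                      .AsEnumerable()
                      .FirstOrDefault();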

Server-side JavaScript Stored Procedures

If I want to perform updates affecting multiple documents within a transaction, I can define a stored procedure in JavaScript – for example, one that swaps pets between families. In this scenario it is important to ensure that one family doesn’t end up with all the pets and the other with none because something unexpected happened. If an error occurs during the swap, the database must roll back the transaction and leave things in a consistent state.  I can do this with the following stored procedure, which runs within the DocumentDB service:

function SwapPets(family1Id, family2Id) {
    var context = getContext();
    var collection = context.getCollection();
    var response = context.getResponse();

    collection.queryDocuments(collection.getSelfLink(), 'SELECT * FROM Families f WHERE f.id = "' + family1Id + '"', {},
    function (err, documents, responseOptions) {
        var family1 = documents[0];

        collection.queryDocuments(collection.getSelfLink(), 'SELECT * FROM Families f WHERE f.id = "' + family2Id + '"', {},
        function (err2, documents2, responseOptions2) {
            var family2 = documents2[0];

            // swap the two families' pets
            var itemSave = family1.pets;
            family1.pets = family2.pets;
            family2.pets = itemSave;

            collection.replaceDocument(family1._self, family1,
                function (err, docReplaced) {
                    collection.replaceDocument(family2._self, family2, {});
                });

            response.setBody(true);
        });
    });
}

 

If an exception is thrown in the JavaScript function (for instance, due to a concurrency violation when updating a record), the transaction is rolled back and the system is returned to the state it was in before the function began.

It’s easy to register the stored procedure in code like below (for example: in a deployment script or app startup code):

// register a stored procedure
StoredProcedure storedProcedure = new StoredProcedure
{
    Id = "SwapPets",
    Body = File.ReadAllText(@".\JS\SwapPets.js")
};

storedProcedure = await client.CreateStoredProcedureAsync(collection.SelfLink, storedProcedure);

 

And just as easy to execute the stored procedure from within your application:

// execute stored procedure passing in the two family documents involved in the pet swap
dynamic result = await client.ExecuteStoredProcedureAsync<dynamic>(storedProcedure.SelfLink, "AndersenFamily", "WakefieldFamily");

If we checked the pets now linked to the Anderson Family we’d see they have been swapped.

Learning More

It’s really easy to get started with DocumentDB and create a simple working application in a couple of minutes.  The above was but one simple example of how to start using it.  Because DocumentDB is schema-less you can use it with literally any JSON document.  Because it performs automatic indexing on every JSON document stored within it, you get screaming performance when querying those JSON documents later. Because it scales linearly with consistent performance, it is ideal for applications you think might get large.

You can learn more about DocumentDB from the new DocumentDB development center here.

Search: Announcing preview of new Search as a Service for Azure

I’m excited to announce the preview of our new Azure Search service.  Azure Search makes it easy for developers to add great search experiences to any web or mobile application.   

Azure Search provides developers with all of the features needed to build out their search experience without having to deal with the typical complexities that come with managing, tuning and scaling a real-world search service.  It is delivered as a fully managed service with an enterprise-grade SLA.  We are also releasing a Free tier of the service today that enables you to use it with small-scale solutions on Azure at no cost.

Provisioning a Search Service

To get started, let’s create a new search service.  In the Azure Preview Portal (http://portal.azure.com), navigate to the Azure Gallery, choose the Data, storage, cache + backup category, and locate the Azure Search gallery item.

image

Locate the “Search” service icon and select Create to create an instance of the service:

image

You can choose from two Pricing Tier options: Standard, which provides dedicated capacity for your search service, and a Free option that allows every Azure subscription to get a small search service in a shared environment at no cost.

The standard tier can be easily scaled up or down and provides dedicated capacity guarantees to ensure that search performance is predictable for your application.  It also supports the ability to index tens of millions of documents across lots of indexes.

The free tier is limited to 10,000 documents, up to 3 indexes, and has no dedicated capacity guarantees. However, it is totally free and provides a great way to learn and experiment with all of the features of Azure Search.

Managing your Azure Search service

After provisioning your Search service, you will land in the Search blade within the portal - which allows you to manage the service, view usage data and tune the performance of the service:

image

I can click on the Scale tile above to bring up the details of the number of resources allocated to my search service. If I had created a Standard search service, I could use this to increase the number of replicas allocated to my service to support more searches per second (or to provide higher availability) and the number of partitions to give me support for higher numbers of documents within my search service.

Creating a Search Index

Now that the search service is created, I need to create a search index that will hold the documents (data) to be searched. To get started, I need two pieces of information from the Azure Portal: the service URL to access my Azure Search service (accessed via the Properties tile) and the Admin Key to authenticate against the service (accessed via the Keys tile).

image

Using this search service URL and admin key, I can start using the search service APIs to create an index and later upload data and issue search requests. I will be sending HTTP requests against the API using that key, so I’ll set up a .NET HttpClient object to do this as follows:

HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("api-key", "19F1BACDCD154F4D3918504CBF24CA1F");

I’ll start by creating the search index. In this case I want an index I can use to search for contacts in my dataset, so I want searchable fields for their names and tags; I also want to track the last contact date (so I can filter or sort on that later on) and their address as a lat/long location so I can use it in filters as well. To make things easy I will be using JSON.NET (to do this, add the NuGet package to your VS project) to serialize objects to JSON.

var index = new
{
    name = "contacts",
    fields = new[]
    {
        new { name = "id", type = "Edm.String", key = true },
        new { name = "fullname", type = "Edm.String", key = false },
        new { name = "tags", type = "Collection(Edm.String)", key = false },
        new { name = "lastcontacted", type = "Edm.DateTimeOffset", key = false },
        new { name = "worklocation", type = "Edm.GeographyPoint", key = false },
    }
};

var response = client.PostAsync("https://scottgu-dev.search.windows.net/indexes/?api-version=2014-07-31-Preview",
                                new StringContent(JsonConvert.SerializeObject(index), Encoding.UTF8, "application/json")).Result;
response.EnsureSuccessStatusCode();

You can run this code as part of your deployment code or as part of application initialization.

Populating a Search Index

Azure Search uses a push API for indexing data. You can call this API with batches of up to 1,000 documents to be indexed at a time. Since it’s your code that pushes data into the index, the original data may be anywhere: in a SQL Database in Azure, a DocumentDB database, blob/table storage, etc.  You can even populate it with data stored on-premises or in a non-Azure cloud provider.

Note that indexing is rarely a one-time operation. You will probably have an initial set of data to load from your data source, but then you will want to push new documents as well as update and delete existing ones. If you use Azure Websites, this is a natural scenario for Webjobs that can run your indexing code regularly in the background.

Regardless of where you host it, the code to index data needs to pull data from the source and push it into Azure Search. In the example below I’m just making up data, but you can see how I could be using the result of a SQL or LINQ query or anything that produces a set of objects that match the index fields we identified above.

var batch = new
{
    value = new[]
    {
        new
        {
            id = "221",
            fullname = "Jay Adams",
            tags = new string[] { "work" },
            lastcontacted = DateTimeOffset.UtcNow,
            worklocation = new
            {
                type = "Point",
                coordinates = new[] { -122.131577, 47.678581 }
            }
        },
        new
        {
            id = "714",
            fullname = "Catherine Abel",
            tags = new string[] { "work", "personal" },
            lastcontacted = DateTimeOffset.UtcNow,
            worklocation = new
            {
                type = "Point",
                coordinates = new[] { -121.825579, 47.1419814 }
            }
        }
    }
};

var response = client.PostAsync("https://scottgu-dev.search.windows.net/indexes/contacts/docs/index?api-version=2014-07-31-Preview",
                                new StringContent(JsonConvert.SerializeObject(batch), Encoding.UTF8, "application/json")).Result;
response.EnsureSuccessStatusCode();

Searching an Index

After creating an index and populating it with data, I can now issue search requests against the index. Searches are simple HTTP GET requests against the index, and responses contain the data we originally uploaded as well as accompanying scoring information.

I can do a simple search by executing the code below, where searchText is a string containing the user input, something like abel work for example:

var response = client.GetAsync("https://scottgu-dev.search.windows.net/indexes/contacts/docs?api-version=2014-07-31-Preview&search=" + Uri.EscapeDataString(searchText)).Result;
response.EnsureSuccessStatusCode();

dynamic results = JsonConvert.DeserializeObject(response.Content.ReadAsStringAsync().Result);

foreach (var result in results.value)
{
    Console.WriteLine("FullName:" + result.fullname + " score:" + (double)result["@search.score"]);
}

Learning More

The above is just a simple scenario of what you can do.  There are a lot of other things we could do with searches. For example, I can use query string options to filter, sort, project and page over the results. I can use hit-highlighting and faceting to create a richer way to navigate results and suggestions to implement auto-complete within my web or mobile UI.
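As a quick hedged sketch of what those query string options look like, here is a filtered, sorted and paged variation of the earlier request.  The parameter names follow the preview REST API’s OData-style conventions; check the reference documentation for the full list:

var url = "https://scottgu-dev.search.windows.net/indexes/contacts/docs" +
          "?api-version=2014-07-31-Preview" +
          "&search=" + Uri.EscapeDataString("work") +
          "&$filter=" + Uri.EscapeDataString("lastcontacted gt 2014-01-01T00:00:00Z") +
          "&$orderby=" + Uri.EscapeDataString("lastcontacted desc") +
          "&$top=10&$skip=0";    // first page of 10 matching results

var response = client.GetAsync(url).Result;
response.EnsureSuccessStatusCode();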

In this example, I used the default ranking model, which uses statistics of the indexed text and search string to compute scores. You can also author your own scoring profiles that model scores in ways that match the needs of your application.

Check out the Azure Search documentation for more details on how to get started, and some of the more advanced use-cases you can take advantage of.  With the free tier now available at no cost to every Azure subscriber, there is no longer any reason not to have Search fully integrated within your applications.

Virtual Machines: Support for SQL Server AlwaysOn, VM Depot images

Last month we added support for managing VMs within the Azure Preview Portal (http://portal.azure.com).  We also released built-in portal support that enables you to easily create multi-VM SharePoint Server Farms as well as a slew of additional Azure Certified VM images.  You can learn more about these updates in my last blog post.

Today, I’m excited to announce new support for automatically deploying SQL Server VMs with AlwaysOn configured, as well as integrated portal support for community-supported VM Depot images.

SQL Server AlwaysOn Template

AlwaysOn Availability Groups, released in SQL Server 2012 and enhanced in SQL Server 2014, guarantee high availability for mission-critical workloads. Last year we started supporting SQL Availability Groups on Azure Infrastructure Services. In such a configuration, two SQL replicas (primary and secondary), each in its own Azure VM, are configured for automatic failover, and a listener (DNS name) is configured for client connectivity. Other components required are a file share witness to guarantee quorum in the configuration to avoid “split brain” scenarios, and a domain controller to join all VMs to the same domain. The SQL as well as the domain controller replicas are each deployed to an availability set to ensure they are in different Azure failure and upgrade domains.

Prior to today’s release, setting up the Availability Group configuration could be tedious and time consuming. We have dramatically simplified this experience through a new SQL Server AlwaysOn template in the Azure Gallery. This template fully automates the configuration of a highly available SQL Server deployment on Azure Infrastructure Services using an Availability Group.

You can find the template by navigating to the Azure Gallery within the Azure Preview Portal (http://portal.azure.com), selecting the Virtual Machine category on the left, and selecting the SQL Server 2014 AlwaysOn gallery item. In the gallery details page, select Create. All you need to do is provide some basic configuration information, such as the administrator credentials for the VMs; the rest of the settings are defaulted for you. You may want to change the default Listener name, as this is what your applications will use to connect to SQL Server.

image

The template creates 5 VMs in the resource group: 2 VMs for the SQL Server replicas, 2 VMs for the Domain Controller replicas, and 1 VM for the file share witness.

Once created, you can RDP to one of the SQL Server VMs to see the Availability Group configuration as depicted below:

image

Try out the SQL Server AlwaysOn template in the Azure Preview Portal today and give us your feedback!

VM Depot in Azure Gallery

Community-driven VM Depot images have been supported on the Azure platform for a couple of years now. But prior to today’s release they weren’t fully integrated into the mainline user experience.

Today, I’m excited to announce that we have integrated community VMs into the Azure Preview Portal and the Azure gallery. With this release, you will find close to 300 pre-configured Virtual Machine images for Microsoft Azure.

Using these images, fully functional Virtual Machines can be deployed in the Preview Portal in minutes and customized for specific use cases. Starting from base operating system distributions (such as Debian, Ubuntu, CentOS, SUSE and FreeBSD), through developer stacks (such as LAMP, Ruby on Rails, Node and Django), to complete applications (such as WordPress, Drupal and Apache Solr), there is something for everyone in VM Depot.

Try out the VM Depot images in the Azure gallery from within the Virtual Machine category.

image

Web Sites: WebJobs and Process Management in the Preview Portal

Starting with today’s Azure release, Web Site WebJobs are now supported in the Azure Preview Portal.  You can also now drill into your Web Sites and monitor the health of any processes running within them (both those that host your web code and those that run your WebJobs).

Web Site WebJobs

Using WebJobs, you can now run any code within your Azure Web Sites – and do so in a way that is readily parallelizable, globally scalable, and complete with remote debugging, full VS support and an optional SDK to facilitate authoring. For more information about the power of WebJobs, visit Azure WebJobs recommended resources.
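As a minimal sketch of what that authoring experience looks like with the optional WebJobs SDK (this assumes a configured storage connection string and a hypothetical queue named "orders"), a continuous WebJob can be as small as this:

using System.IO;
using Microsoft.Azure.WebJobs;

public class Program
{
    static void Main()
    {
        // The JobHost listens for triggers and dispatches to the functions below
        var host = new JobHost();
        host.RunAndBlock();
    }

    // Invoked automatically whenever a new message lands on the "orders" queue
    public static void ProcessOrder([QueueTrigger("orders")] string message, TextWriter log)
    {
        log.WriteLine("Processing order: " + message);
    }
}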

With today’s Azure release, we now support two types of WebJobs: On Demand and Continuous.  To use WebJobs in the preview portal, navigate to your web site and select the WebJobs tile within the Web Site blade. Notice that the tile also now shows the count of WebJobs available.

image

By drilling into the tile, you can view existing WebJobs as well as create new On Demand or Continuous WebJobs. Scheduled WebJobs are not yet supported in the preview portal, but expect to see this in the near future.

Web Site Processes

I’m excited to announce a new feature in the Azure Web Sites experience in the Preview Portal – Websites Processes. Using Websites Processes you can enumerate the different instances of your site, browse through the different processes on each instance, and even drill down to the handles and modules associated with each process. You can then check for detailed information like version, language and more.

image

In addition, you also get rich monitoring for CPU, Working Set and Thread count at the process level.  Just like with Task Manager for Windows, data collection begins when you open the Websites Processes blade, and stops when you close it.

image

This feature is especially useful when your site has been scaled out and is misbehaving in some specific instances but not in others. You can quickly identify runaway processes, find open file handles, and even kill a specific process instance.

Monitoring and Management SDK: Programmatic Access to Monitoring Data

The Azure Management Portal provides built-in monitoring and management support that makes it easy for you to track the health of your applications and solutions deployed within Azure.

If you want to programmatically access monitoring and management features in Azure, you can also now use our .NET SDK from NuGet. We are releasing this SDK to general availability today, so you can now use it for your production services!

For example, if you want to build your own custom dashboard that shows metric data from across your services, you can get that metric data via the SDK:

// Create the metrics client by obtaining the certificate with the specified thumbprint.
// (GetStoreCertificate is a helper that loads the management certificate from the local store.)
MetricsClient metricsClient = new MetricsClient(new CertificateCloudCredentials(SubscriptionId, GetStoreCertificate(Thumbprint)));

// Build the resource ID string.
string resourceId = ResourceIdBuilder.BuildWebSiteResourceId("webtest-group-WestUSwebspace", "webtests-site");

// Get the metric definitions.
MetricDefinitionCollection metricDefinitions = metricsClient.MetricDefinitions.List(resourceId, null, null).MetricDefinitionCollection;

// Display the available metric definitions.
Console.WriteLine("Choose metrics (comma separated) to list:");
int count = 0;
foreach (MetricDefinition metricDefinition in metricDefinitions.Value)
{
    Console.WriteLine(count + ":" + metricDefinition.DisplayName);
    count++;
}

// Ask the user which metrics they are interested in.
var desiredMetrics = Console.ReadLine().Split(',').Select(x => metricDefinitions.Value.ToArray()[Convert.ToInt32(x.Trim())]);

// Get the metric values for the last 20 minutes.
MetricValueSetCollection values = metricsClient.MetricValues.List(
    resourceId,
    desiredMetrics.Select(x => x.Name).ToList(),
    "",
    desiredMetrics.First().MetricAvailabilities.Select(x => x.TimeGrain).Min(),
    DateTime.UtcNow - TimeSpan.FromMinutes(20),
    DateTime.UtcNow
).MetricValueSetCollection;

// Display the metric values to the user.
foreach (MetricValueSet valueSet in values.Value)
{
    Console.WriteLine(valueSet.DisplayName + " for the past 20 minutes:");
    foreach (MetricValue metricValue in valueSet.MetricValues)
    {
        Console.WriteLine(metricValue.Timestamp + "\t" + metricValue.Average);
    }
}

Console.Write("Press any key to continue:");
Console.ReadKey();

We support metrics for a variety of services with the monitoring SDK:

| Service | Typical metrics | Frequencies |
| --- | --- | --- |
| Cloud services | CPU, Network, Disk | 5 min, 1 hr, 12 hrs |
| Virtual machines | CPU, Network, Disk | 5 min, 1 hr, 12 hrs |
| Websites | Requests, Errors, Memory, Response time, Data out | 1 min, 1 hr |
| Mobile Services | API Calls, Data Out, SQL performance | 1 hr |
| Storage | Requests, Success rate, End2End latency | 1 min, 1 hr |
| Service Bus | Messages, Errors, Queue length, Requests | 5 min |
| HDInsight | Containers, Apps running | 15 min |

If you’d like to manage advanced autoscale settings that aren’t possible to configure in the Portal, you can also do that via the SDK. For example, you can set up autoscale based on custom metrics – you can autoscale by anything that is returned from MetricDefinitions.

All of the documentation on the SDK is available on MSDN.

API Management: Support for Services REST API

We launched the Azure API Management service into preview in May of this year.  The API Management service enables customers to quickly and securely publish APIs to partners, the public development community, and even internal developers.

Today, I’m excited to announce the availability of the API Management REST API which opens up a large number of new scenarios. It can be used to manage APIs, products, subscriptions, users and groups in addition to accessing your API analytics. In fact, virtually any management operation available in the API Management Portal is now accessible programmatically - opening up a host of integration and automation scenarios, including directly monetizing an API with your commerce provider of choice, taking over user or subscription management, automating API deployments and more.

We've even provided an additional SAS (Shared Access Signature) security option. An integrated experience in the publisher portal allows you to generate SAS tokens, so securely calling your API Management service couldn’t be easier. It takes just three easy steps:

  1. Enable the API on the System Settings page on the Publisher Portal
  2. Acquire a time-limited access token either manually or programmatically
  3. Start sending requests to the API, providing the token with every request

image 

See the REST API reference for full details.
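For example, here is a minimal hedged sketch of step 3 from .NET.  The host name pattern and api-version below are assumptions based on the preview documentation, so consult the REST API reference for the exact values for your service:

using System.Net.Http;

var client = new HttpClient();
client.DefaultRequestHeaders.TryAddWithoutValidation(
    "Authorization", "SharedAccessSignature <token-from-publisher-portal>");

// Enumerate the APIs published in the service
var response = client.GetAsync(
    "https://<your-service>.management.azure-api.net/apis?api-version=2014-02-14-preview").Result;
response.EnsureSuccessStatusCode();

string json = response.Content.ReadAsStringAsync().Result;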

Delegation of user registration and product subscription

The new API Management REST API makes it easy to automate and integrate other processes with API management. Many customers integrating in this way already have a user account system and would prefer to use this existing resource, instead of the built-in functionality provided by the Developer Portal. This feature, called Delegation, enables your existing website or backend to own the user data, manage subscriptions and seamlessly integrate with API Management's dynamically generated API documentation.

image

It's easy to enable Delegation: in the Publisher Portal navigate to the Delegation section and enable Delegated Sign-in and Sign up, provide the endpoint URL and validation key and you're good to go. For more details, check out the how-to guide.

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

Azure: Virtual Machine, Machine Learning, IoT Event Ingestion, Mobile, SQL, Redis, SDK Improvements

This past month we’ve released a number of great enhancements to Microsoft Azure.  These include:

  • Virtual Machines: Preview Portal Support as well as SharePoint Farm Creation
  • Machine Learning: Public preview of the new Azure Machine Learning service
  • Event Hub: Public preview of new Azure Event Ingestion Service
  • Mobile Services: General Availability of .NET support, SignalR support
  • Notification Hubs: Price Reductions and New Features
  • SQL Database: New Geo-Restore, Geo-Replication and Auditing support
  • Redis Cache: Larger Cache Sizes
  • Storage: Support for Zone Redundant Storage
  • SDK: Tons of great VS and SDK improvements

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Virtual Machines: Support in the new Azure Preview portal

We previewed the new Azure Preview Portal at the //Build conference earlier this year.  It brings together all of your Azure resources in a single management portal, and makes it easy to build cloud applications on the Azure platform using our new Azure Resource Manager (which enables you to manage multiple Azure resources as a single application).  The initial preview of the portal supported Web Sites, SQL Databases, Storage, and Visual Studio Online resources.

This past month we’ve extended the preview portal to also now support Virtual Machines.  You can create standalone VMs using the portal, or group multiple VMs (and PaaS services) together into a Resource Group and manage them as a single logical entity. You can use the preview portal to get deep insights into billing and monitoring of these resources, and customize the portal to view the data however you want.  If you are an existing Azure customer you can start using the new portal today: http://portal.azure.com.

Below is a screen-shot of the new portal in action.  The service dashboard showing service/region health can be seen in the top-left of the portal, along with billing data about my subscriptions – both make it really easy for you to see the health and usage of your services in Azure.  In the screen-shot below I have a single VM running named “scottguvstest” – and clicking the tile for it displays a “blade” of additional details about it to the right – including integrated performance monitoring usage data:

image

The initial “blade” for a VM provides a summary view of common metrics about it.  You can click any of the tiles to get even more detailed information as well. 

For example, below I’ve clicked the CPU monitoring tile in my VM, which brought up a Metric blade with even more details about CPU utilization over the last few days.  I’ve then clicked the “Add Alert” command within it to set up an automatic alert that will trigger (and send an email to me) any time the CPU of the VM goes above 95%:

image

In the screen-shot below, I’ve clicked the “Usage” tile within the VM blade, which displays details about the different VM sizes available – and what each VM size provides in terms of CPU, memory, disk IOPS and other capabilities.  Changing the size of the VM being used is as simple as clicking another of the pricing tiles within the portal – no redeployment of the VM required:

image

SharePoint Farm support via the Azure Gallery

Built into the Azure Preview Portal is a new “Azure Gallery” that provides an easy way to deploy a wide variety of VM images and online services.  VM images in the Azure Gallery include Windows Server, SQL Server, SharePoint Server, Ubuntu, Oracle, and Barracuda images. 

Last month, we also enabled a new “SharePoint Server Farm” gallery item.  It enables you to easily configure and deploy a highly-available SharePoint Server Farm consisting of multiple VM images (databases, web servers, domain controllers, etc) in only minutes.  It provides the easiest way to create and configure SharePoint farms anywhere:

image

Over the next few months you’ll see even more items show up in the gallery – enabling a variety of additional new scenarios.  Try out the ones in the gallery today by visiting the new Azure portal: http://portal.azure.com/

Machine Learning: Preview of new Machine Learning Service for Azure

Last month we delivered the public preview of our new Microsoft Azure Machine Learning service, a game changing service that enables your applications and systems to significantly improve your organization’s understanding across vast amounts of data. Azure Machine Learning (Azure ML) is a fully managed cloud service with no software to install, no hardware to manage, and no OS versions or development environments to grapple with. Armed with nothing but a browser, data scientists can log into Azure and start developing Machine Learning models from any location, and from any device.

ML Studio, an integrated development environment for Machine Learning, lets you set up experiments as simple data flow graphs, with an easy-to-use drag, drop and connect paradigm. Data scientists can use it to avoid programming a large number of common tasks, allowing them to focus on experiment design and iteration. A collection of best-of-breed algorithms developed by Microsoft Research comes built in, as does support for custom R code – and over 350 open source R packages can be used securely within Azure ML today.

image

Azure ML also makes it simple to create production deployments at scale in the cloud. Pre-trained Machine Learning models can be incorporated into a scoring workflow and, with a few clicks, a new cloud-hosted REST API can be created.

Azure ML makes the incredible potential of Machine Learning accessible both to startups and large enterprises. Startups are now able to immediately apply machine learning to their applications. Larger enterprises are able to unleash the latent value in their big data to generate significantly more revenue and efficiencies. Above all, the speed of iteration and experimentation that is now possible will allow for rapid innovation and pave the way for intelligence in cloud-connected devices all around us.

Getting Started

Getting started with the Azure Machine Learning Service is easy.  Within the current Azure Portal simply choose New->Data Services->Machine Learning to create your first ML service today:

image

Subscribe to the Machine Learning Team Blog to learn more about the Azure Machine Learning service.  And visit our Azure Machine Learning documentation center to watch videos and explore tutorials on how to get started immediately.

Event Hub: Preview of new Azure Event Ingestion Service

Today’s connected world is defined by big data.  Big data may originate from connected cars and thermostats that produce telemetry data every few minutes, application performance counters that generate events every second or mobile apps that capture telemetry for every user’s individual action. The rapid proliferation of connected devices raises challenges due to the variety of platforms and protocols involved.  Connecting these disparate data sources while handling the scale of the aggregate stream is a significant challenge. 

I’m happy to announce the public preview of a significant new Azure service: Event Hub. Event Hub is a highly scalable pub-sub ingestor capable of elastic scale to handle millions of events per second from millions of connected devices so that you can process and analyze the massive amounts of data produced by your connected devices and applications. With this new service, we now provide an easy way for you to provision capacity for ingesting events from a variety of sources, and over a variety of protocols in a secure manner. Event Hub supports a variety of partitioning modes to enable parallelism and scale in your downstream processing tier while preserving the order of events on a per device basis.
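As a rough sketch of the developer experience (the connection string, hub name and payload below are placeholders), publishing an event from .NET with the Service Bus SDK looks like this:

using System.Text;
using Microsoft.ServiceBus.Messaging;

var client = EventHubClient.CreateFromConnectionString(
    "<service-bus-connection-string>", "<your-event-hub-name>");

string payload = "{ \"deviceId\": \"thermostat-42\", \"temperature\": 21.5 }";
var eventData = new EventData(Encoding.UTF8.GetBytes(payload))
{
    // Events with the same partition key preserve ordering on a per-device basis
    PartitionKey = "thermostat-42"
};

client.Send(eventData);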

Creating an Event Hub

You can easily create a new instance of Event Hub from the Azure Management Portal by clicking New->App Services->Service Bus->Event Hub. During the Preview, Event Hub service is available in a limited number of regions (East US 2, West Europe, Southeast Asia) and requires that you first create a new Service Bus Namespace:

image

Learn More

Try out the new Event Hub service and give us your feedback! For more information, visit the Event Hub documentation.

Mobile Services: General Availability of .NET Support, SignalR and Offline Sync

A few months ago I announced a preview of Mobile Services with .NET backend support. Today I am excited to announce the general availability of the Mobile Services .NET offering, which makes it an incredibly attractive choice for developers building mobile-facing backend APIs using .NET.  Using Mobile Services you can now:

  • Quickly add a fully featured backend to your iOS, Android, Windows, Windows Phone, HTML or cross-platform Xamarin, Sencha, or PhoneGap app, leveraging ASP.NET Web API, Mobile Services, and corresponding Mobile Services client SDKs.
  • Publish any existing ASP.NET Web API to Azure and have Mobile Services monitor and manage your Web API controllers for you.
  • Take advantage of built-in mobile capabilities like push notifications, real-time notifications with SignalR, enterprise sign-on with Azure Active Directory, social auth, and offline data sync for occasionally connected scenarios. You can also take full advantage of Web API features like OData controllers, and 3rd party Web API-based frameworks like Breeze.
  • Have your mobile app’s users login via Azure Active Directory and securely access enterprise assets such as SharePoint and Office 365. In addition, we've also enabled seamless connectivity to on-premises assets, so you can reach databases and web services that are not exposed to the Internet and behind your company’s firewall.
  • Build, test, and debug your Mobile Services .NET backend using Visual Studio running locally on your machine or remotely in Azure.

You can learn more about Mobile Services .NET from this blog post, and the Mobile Services documentation center.
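To give a feel for the programming model, below is a hedged sketch of the kind of table controller the Visual Studio scaffolding produces in a Mobile Services .NET backend.  The TodoItem entity is illustrative, and MobileServiceContext stands in for the Entity Framework context the project template generates:

using System.Linq;
using System.Threading.Tasks;
using System.Web.Http;
using System.Web.Http.Controllers;
using System.Web.Http.OData;
using Microsoft.WindowsAzure.Mobile.Service;

public class TodoItem : EntityData
{
    public string Text { get; set; }
    public bool Complete { get; set; }
}

public class TodoItemController : TableController<TodoItem>
{
    protected override void Initialize(HttpControllerContext controllerContext)
    {
        base.Initialize(controllerContext);
        // MobileServiceContext is the EF context generated by the project template
        var context = new MobileServiceContext();
        DomainManager = new EntityDomainManager<TodoItem>(context, Request, Services);
    }

    // GET tables/TodoItem
    public IQueryable<TodoItem> GetAllTodoItems()
    {
        return Query();
    }

    // PATCH tables/TodoItem/{id}
    public Task<TodoItem> PatchTodoItem(string id, Delta<TodoItem> patch)
    {
        return UpdateAsync(id, patch);
    }

    // POST tables/TodoItem
    public async Task<IHttpActionResult> PostTodoItem(TodoItem item)
    {
        TodoItem current = await InsertAsync(item);
        return CreatedAtRoute("Tables", new { id = current.Id }, current);
    }
}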

Real-time Push with Mobile Services and SignalR

We recently released an update to our Mobile Services .NET backend support which enables you to use ASP.NET SignalR for real-time, bi-directional communications with your mobile applications. SignalR will use WebSockets under the covers when available, and fall back to other “techniques” (i.e. HTTP hacks) when it isn't. Regardless of the mode, your application code stays the same.
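As a minimal sketch, a SignalR Hub you might host in your backend looks like this (the hub and method names are illustrative):

using Microsoft.AspNet.SignalR;

public class StatusHub : Hub
{
    // Called by connected clients; broadcasts to every other connected client
    public void SendStatus(string deviceId, string status)
    {
        Clients.Others.updateStatus(deviceId, status);
    }
}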

The SignalR integration with Azure Mobile Services includes:

  • Turnkey Web API Integration: Send messages to your connected SignalR applications from any Web API controller or scheduled job – we automatically give you access to SignalR Hubs from the ApiServices context.
  • Unified Authentication: Protect your SignalR Hubs the same way you protect any of your Mobile Service Web API controllers using a simple AuthorizeLevel attribute.
  • Automatic Scale-out: When scaling out your Azure Mobile Service using multiple front-ends, we automatically scale out SignalR using Azure Service Bus as the backplane for syncing between the front-ends. You don’t need to do anything to scale your SignalR Hubs.

Learn more about the SignalR capability in Mobile Services from Henrik’s blog.

Mobile Services Offline Sync support for Xamarin and native iOS apps

I've blogged earlier about the new Offline Sync feature in Mobile Services, which provides a lightweight, cross-platform way for applications to work with data even when they are offline / disconnected from the network. At that time we released Offline Sync support for Windows Phone and Windows Store apps.

Today we are also introducing a preview of Mobile Services Offline Sync for native iOS apps, as well as Xamarin.iOS, and Xamarin.Android.

Mobile Services Accelerators

I’m pleased to also introduce our new Mobile Services Accelerators, which are feature-complete sample apps that demonstrate how to leverage the new enterprise features of the Mobile Services platform in an end-to-end scenario. We have two accelerator apps for you today, available as source code as well as published in the app stores.

These apps leverage the Mobile Services .NET backend support to authenticate employees with Azure Active Directory, store data securely, work with data offline, and deliver reminders via push notifications. We hope you will find these apps useful reference material for your teams. Stay tuned, as more accelerators are coming!

Notification Hubs: Price reductions and new features

The Azure Notification Hubs service enables large-scale, cross-platform push notifications from any server backend running on-premises or in the cloud.  It supports a variety of mobile devices including iOS, Android, Windows, Kindle Fire, and Nokia X. I am excited to announce several great updates to Azure Notification Hubs today:

  • Price reduction. We are reducing the Notification Hubs price by up to 40x to accommodate a wider range of customer scenarios. With the new price (effective September 1st), customers can send 1 million mobile push notifications per month for free, and pay $1 per additional million pushes using our new Basic tier. Visit the Notification Hubs pricing page for more details.
  • Scheduled Push. You can now use Notification Hubs to schedule individual and broadcast push notifications at certain times of the day. For example, you can use this feature to schedule announcements to be delivered in the morning to your customers.  We include support to enable this no matter which time zone your customers are in (see the sketch after this list).
  • Bulk Registration management. You can now send bulk jobs to create, update or export millions of mobile device registrations at a time with a single API call. This is useful if you are moving from an old push notification system to Notification Hubs, or to import user segments from a 3rd party analytics system.
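Below is a hedged sketch of scheduling a broadcast notification from .NET; it assumes the Service Bus SDK’s NotificationHubClient with the new ScheduleNotificationAsync support, and uses placeholder connection string and hub names:

using System;
using Microsoft.ServiceBus.Notifications;

var hub = NotificationHubClient.CreateClientFromConnectionString(
    "<notification-hub-connection-string>", "<hub-name>");

// Deliver a Windows toast broadcast tomorrow at 9am UTC
string toast = "<toast><visual><binding template=\"ToastText01\">" +
               "<text id=\"1\">Good morning!</text></binding></visual></toast>";

await hub.ScheduleNotificationAsync(
    new WindowsNotification(toast),
    DateTimeOffset.UtcNow.Date.AddDays(1).AddHours(9));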

You can learn more about Azure Notification Hubs at the developer center.

SQL Databases: New Geo-Restore, Geo-Replication and Auditing support

In April 2014, we first previewed our new SQL Database service tiers: Basic, Standard, and Premium. Today, I’m excited to announce the addition of more features to the preview:

  • Geo-restore: Designed for emergency data recovery when you need it most, geo-restore allows you to recover a database to any Azure region. Geo-restore uses geo-redundant Azure blob storage for automatic database backups and is available for Basic, Standard, and Premium databases in the Windows Azure Management Portal and REST APIs.
  • Geo-replication: You can now configure your SQL Databases to use our built-in geo-replication support that enables you to setup an asynchronously replicated secondary SQL Database that can be failed over to in the event of disaster.  Geo-replication is available for Standard and Premium databases, and can be configured via the Windows Azure Management portal and REST APIs. You can get more information about Azure SQL Database Business Continuity and geo-replication here and here.
  • Auditing: Our new auditing capability tracks and logs events that occur in your database and provides dashboard views and reports that enables you to get insights into these events. You can use auditing to streamline compliance-related activities, gain knowledge about what is happening in your database, and to identify trends, discrepancies and anomalies. Audit events are also written to an audit log which is stored in a user-designated Azure storage account.  Auditing is now available for all Basic, Standard, and Premium databases.

You can learn even more about these new features here.

Redis Cache: Large Cache Sizes, Six New Regions, Redis MaxMemory Policy Support

This past May, we launched the public preview of the new Azure Redis Cache service. This cache service gives you the ability to use a secure, dedicated Redis cache, managed as a service by Microsoft. Using the new Cache service, you get to leverage the rich feature set and ecosystem provided by Redis, and reliable hosting and monitoring from Microsoft.

Last month we updated the service with the following features:

  • Support for larger cache sizes. We now support the following sizes: 250 MB, 1 GB, 2.5 GB, 6 GB, 13 GB and 26 GB. 
  • Support for six new Azure Regions. The full list of supported regions can be found in the Azure Regions page.
  • Support for configuring Redis MaxMemory policy

For more information on the Azure Redis Cache, check out this blog post: Lap around Azure Redis Cache.

Storage: Support for Zone Redundant Storage

We are happy to introduce a new Azure Storage account offering: Zone Redundant Storage (ZRS).

ZRS replicates your data across 2 to 3 facilities either within a single Azure region or across two Azure regions. If your storage account has ZRS enabled, then your data is durable even in the case where one of the datacenter facilities hosting your data suffers a catastrophic issue. ZRS is also more cost efficient than the existing Geo Redundant Storage (GRS) offering we have today.

You can create a ZRS storage account by simply choosing the ZRS option under the replication dropdown in the Azure Management Portal.

image

You can find more information on pricing for ZRS at http://azure.microsoft.com/en-us/pricing/details/storage/.

Azure SDK: WebSites, Mobile, Virtual Machines, Storage and Cloud Service Enhancements

Earlier today we released the Update 3 release of Visual Studio 2013 as well as the new Azure SDK 2.4 release.  These updates contain a ton of great new features that make it even easier to build solutions in the cloud using Azure.  Today’s updates include:

Visual Studio Update 3

  • Websites: Publish WebJobs from Console or Web projects.
  • Mobile Services: Create a Dev/Test environment in the cloud when creating Mobile Services projects. Use the Push Notification Wizard with .NET Mobile Services.
  • Notification Hubs: View and manage device registrations.

Azure SDK 2.4

  • Virtual Machines: Remote debug 32-bit Virtual Machines. Configure Virtual Machines, including installation & configuration of dynamic extensions (e.g. anti-malware, Puppet, Chef and custom script). Create Virtual Machine snapshots of the disk state.
  • Storage: View Storage activity logs for diagnostics. Provision Read-Access Geo-redundant Storage from Visual Studio.
  • Cloud Services: Emulator Express is the default option for new projects (Full Emulator is deprecated). Configure new networking capabilities in the service model.

You can learn all about the updates from the Azure team’s SDK announcement blog post.

Summary

This most recent release of Azure includes a bunch of great features that enable you to build even better cloud solutions.  If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Free ebook: Building Cloud Apps with Microsoft Azure

image

Last week MS Press published a free ebook based on the Building Real-World Apps using Azure talks I gave at the NDC and TechEd conferences.  The talks and the book walk through a patterns-based approach to building real-world cloud solutions, and help make it easier to understand how to be successful with cloud development.

Videos of the Talks

You can watch a video recording of the talks I gave here:

 Part 1: Building Real World Cloud Apps with Azure

 Part 2: Building Real World Cloud Apps with Azure

eBook Downloads

You can now download a completely free PDF, Mobi or ePub version of the ebook based on the talks using the links below:

Download the PDF (6.35 MB)  

Download the EPUB file (12.3 MB)  

Download the Mobi for Kindle file (22.7 MB)

Hope this helps,

Scott


Azure: VM Security Extensions, ExpressRoute GA, Reserved IPs, Internal Load Balancing, Multi Site-to-Site VPNs, Storage Import/Export GA, New SMB File Service, API Management, Hybrid Connection Service, Redis Cache, Remote Apps and more…

This morning we released a massive amount of enhancements to Microsoft Azure.  Today’s new capabilities and announcements include:

  • Virtual Machines: Integrated Security Extensions including Built-in Anti-Virus Support and Support for Capturing VM images in the portal
  • Networking: ExpressRoute General Availability, Multiple Site-to-Site VPNs, VNET-to-VNET Secure Connectivity, Reserved IPs, Internal Load Balancing
  • Storage: General Availability of Import/Export service and preview of new SMB file sharing support
  • Remote App: Public preview of Remote App Service – run client apps in the cloud
  • API Management: Preview of the new Azure API Management Service
  • Hybrid Connections: Easily integrate Azure Web Sites and Mobile Services with on-premises data+apps (free tier included)
  • Cache: Preview of new Redis Cache Service
  • Store: Support for Enterprise Agreement customers and channel partners

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Virtual Machines: Integrated Security Extensions including Built-in Anti-Virus Support

In a previous blog post I talked about the new VM Agent we introduced as an optional extension to Virtual Machines hosted on Azure.  The VM Agent is a lightweight and unobtrusive process that you can optionally enable to run inside your Windows and Linux VMs. The VM Agent can then be used to install and manage extensions, which are software modules that extend the functionality of a VM and help make common management scenarios easier.

Today I’m pleased to announce three new security extensions that we are enabling via the VM Agent:

  • Microsoft Antimalware
  • Symantec Endpoint Protection
  • TrendMicro’s Deep Security Agent

These extensions enable you to add richer security protection to your Virtual Machines using respected security products whose installation and management we automate.  These extensions are easy to enable within your Virtual Machines through either the Azure Management Portal or via the command-line.  To enable them using the Azure Management Portal, simply check them when you create a new Virtual Machine:

image

Once checked, we’ll automatically install and run them within your VM.

Custom Powershell Script

This week we’ve also enabled a new “Custom Script” extension that enables you to specify a Powershell script file (.ps1 extension) to run in the VM immediately after it’s created.  This provides another way to customize your VM on creation without having to RDP in.  Alternatively you can also take advantage of the Chef and Puppet extensions we shipped last month.

Virtual Machines: Support for Capturing Images with both OS + Data Drives attached

Last month at the //Build conference we released command-line support for capturing VM images that contain both an OS disk as well as multiple data disks attached.  This new VM image support made it much easier to capture and automate VMs with richer configurations, as well as to snapshot VMs without having to run sysprep on them. 

With today’s release we have updated the Azure Management Portal to add support for capturing VM images that contain both an OS disk and multiple data disks.  One cool aspect of the “Capture” command is that it can now be run on both a stopped VM and a running VM (there is no need to restart it, and the capture command completes in under a minute). 

To try this new support out, simply click the “Capture” button on a VM, and it will present a dialog that enables you to name the image you want to create: 

image

Once the image is captured it will show up in the “Images” section of the VM gallery – allowing you to easily create any number of new VM instances from it:

image

This new support is ideal for dev/test scenarios as well as for creating re-usable images for use with any other VM creation scenario.

Networking: General Availability of Azure ExpressRoute

I’m excited to announce the general availability release today of the Azure ExpressRoute service.

ExpressRoute enables dedicated, private, high-throughput network connectivity between Azure datacenters and your on-premises IT environments. Using ExpressRoute, you can connect your existing datacenters to Azure without having to flow any traffic over the public Internet, and enable guaranteed network quality-of-service and the ability to use Azure as a natural extension of an existing private network or datacenter.  As part of our GA release we now offer an enterprise SLA for the service, as well as a variety of bandwidth tiers.

We have previously announced several ExpressRoute provider partnerships, including AT&T, Equinix, Verizon, BT, and Level3.  This week we are excited to announce new partnerships with TelecityGroup, SingTel and Zadara as well.  You can use any of these providers to setup private fiber connectivity directly to Azure using ExpressRoute.

You can get more information on the ExpressRoute website.

Networking: Multiple Site-to-Site VPNs and VNET-to-VNET Connectivity

I’m excited to announce the general availability release of two highly requested virtual networking features: multiple site-to-site VPN support and VNET-to-VNET connectivity.

Multiple Site to Site VPNs

Virtual Networks in Azure now support more than one site-to-site connection, which enables you to securely connect multiple on-premises locations with a Virtual Network (VNET) in Azure. Using more than one site-to-site connection comes at no additional cost – you incur charges only for the VNET gateway uptime.

clip_image032

VNET to VNET Connectivity

With today’s release, we are also enabling VNET-to-VNET connectivity, which means that multiple virtual networks can now be directly and securely connected with one another. You can connect VNETs that are running in the same or different Azure regions; when the VNETs are in different regions, the traffic is securely routed over the Microsoft network backbone.

This feature enables scenarios that require presence in multiple regions (e.g. Europe and US, or East US and West US), applications that are highly available, or the integration of VNETs within a single region for a much larger network. This feature also enables you to connect VNETs across multiple different Azure account subscriptions, so you can now connect workloads across different divisions of your organization, or even different companies. The data traffic flowing between VNETs is charged at the same rate as egress traffic.

clip_image034
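
Configuration is currently driven through the network configuration file and PowerShell.  As a rough sketch, once each VNET lists the other as a local network site, you connect the two gateways by setting a matching shared key on both sides (the VNET and site names below are placeholders):

# Sketch: set the same IPsec shared key on both VNET gateways to link them.
Set-AzureVNetGatewayKey -VNetName "VNetUSEast" -LocalNetworkSiteName "VNetEuropeSite" -SharedKey "Abc123Abc123"
Set-AzureVNetGatewayKey -VNetName "VNetEurope" -LocalNetworkSiteName "VNetUSEastSite" -SharedKey "Abc123Abc123"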

You can get more information on the Virtual Network website.

Networking: IP Reservation, Instance-level public IPs, Internal Load Balancing Support, Traffic Manager 

With today’s release we are also making available three highly requested IP address features:

IP Reservations

With IP reservation, you can now reserve public IP addresses and use them as virtual IP (VIP) addresses for your applications. This enables scenarios where applications need to have static public IP addresses, and you want to be able to have the IP address survive the application being deleted and redeployed.  You can now reserve up to 5 addresses per subscription free of charge and assign them to VM or Cloud Service instances of your choice. If additional VIP reservations are needed, you can also reserve more addresses at additional cost.

This feature is now generally available as of today.  You can enable it via the command-line using new PowerShell cmdlets that we now support:

# Reserve a public IP address
New-AzureReservedIP -ReservedIPName EastUSVIP -Label "Reserved VIP in EastUS" -Location "East US"

# Use the reserved IP during deployment
New-AzureVM -ServiceName "MyApp" -VMs $web1 -Location "East US" -VNetName VNetUSEast -ReservedIPName EastUSVIP

We will enable portal management support in a future management portal update.

Public IP Address per Virtual Machine

With Instance-level Public IPs for VMs, you can now assign public IP addresses to your virtual machines, so they become directly addressable without having to map an endpoint through a VIP. This feature will enable scenarios like easily running FTP servers in Azure and monitoring virtual machines directly using their IPs. 

We are making this new capability available in preview form today.  This feature is available only with new deployments and new virtual networks and can be enabled via PowerShell.
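
As a rough sketch, assigning a public IP at VM creation time looks like the following (the image, service, and IP names are placeholders, and the VM must be deployed into a virtual network):

# Sketch: create a VM with an instance-level public IP attached.
$vm = New-AzureVMConfig -Name "ftp1" -InstanceSize Small -ImageName $imageName |
      Add-AzureProvisioningConfig -Windows -AdminUsername $user -Password $password |
      Set-AzurePublicIP -PublicIPName "ftpip"
New-AzureVM -ServiceName "MyApp" -VMs $vm -Location "East US" -VNetName "VNetUSEast"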

Internal Load Balancing (ILB) Support

Today’s new Internal Load Balancing support enables you to load-balance Azure virtual machines with a private IP address. The internally load balanced IP address will be accessible only within a virtual network (if the VM is within a virtual network) or within a cloud service (if the VM isn’t within a virtual network) – meaning that no one outside of your application can access it. Internal Load Balancing is useful when you’re creating applications in which some of the tiers (for example: the database layer) aren’t public facing but require load balancing functionality. Internal Load Balancing is available in the standard tier of VMs at no additional cost.

We are making this new capability available in preview form today. ILB is available only with new deployments and new virtual networks and can be accessed via PowerShell.
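
As a rough sketch, you can add an internal load balancer to a deployment and then hang load-balanced endpoints off of it (all names below are placeholders):

# Sketch: add an ILB to an existing cloud service deployment inside a VNET subnet.
Add-AzureInternalLoadBalancer -InternalLoadBalancerName "MyILB" -ServiceName "MyApp" -SubnetName "Subnet-1"

# Then add a load-balanced endpoint on each VM in the set, bound to the ILB.
Get-AzureVM -ServiceName "MyApp" -Name "db1" |
    Add-AzureEndpoint -Name "sql" -Protocol tcp -LocalPort 1433 -PublicPort 1433 `
                      -LBSetName "sqlset" -InternalLoadBalancerName "MyILB" -DefaultProbe |
    Update-AzureVM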

Traffic Manager support for external endpoints

Starting today, Traffic Manager now supports routing traffic to both Azure endpoints and external endpoints (previously it only supported Azure endpoints).

Traffic Manager enables you to control the distribution of user traffic to your specified endpoints. With support for endpoints that reside outside of Azure, you can now build highly available applications that span Azure, on-premises environments, and even other cloud providers. You can apply intelligent traffic management policies across all managed endpoints. This functionality is available now in preview, and you can manage it via the command-line using PowerShell.
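
As a rough sketch of the PowerShell experience, an external endpoint can be added to an existing profile like this (the profile and domain names are placeholders, and the -Type Any value is my assumption for non-Azure endpoints):

# Sketch: add an external (non-Azure) endpoint to a Traffic Manager profile.
$profile = Get-AzureTrafficManagerProfile -Name "myprofile"
Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $profile -DomainName "app.contoso.com" -Status Enabled -Type Any |
    Set-AzureTrafficManagerProfile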

Learning More

You can learn more about Reserved IP addresses and the above networking features here.

Storage: General Availability Release of Azure Import/Export Service

Last November, we launched the preview of our Microsoft Azure Import/Export Service. Today, I am excited to announce the general availability release of the service.

The Microsoft Azure Import/Export Service enables you to move large amounts of data into and out of your Microsoft Azure Storage accounts by transferring them on hard disks. You can ship encrypted hard drives directly to our Microsoft Azure data centers, and we will automatically transfer the data to or from your Microsoft Azure Blobs for your storage account.  This enables you to import or export massive amounts of data quickly, cost effectively, and without being constrained by your network bandwidth.

This release of the Import/Export service has several new features as well as improvements to the preview functionality. We have expanded the service beyond the US – it is now available in the US, Europe and Asia Pacific regions. You can also now use either FedEx or DHL to ship the drives.  Simply provide an appropriate FedEx/DHL account number and we will also automatically ship the drives back to you:

clip_image028

More details about the improvements and new features of the Import/Export service can be found on the Microsoft Azure Storage Team Blog. Check out the Getting Started Guide to learn about how to use the Import/Export service. Feel free to send questions and comments to waimportexport@microsoft.com.

Storage: New SMB File Sharing Service

I’m excited to announce the preview of the new Microsoft Azure File Service. The Azure File Service is a new capability of our existing Azure storage system and supports exposing network file shares using the standard SMB protocol.  Applications running in Azure can now easily share files across Windows and Linux VMs using this new SMB file-sharing service, with all VMs having both read and write access to the files.  The files stored within the service can also be accessed via a REST interface, which opens a variety of additional non-SMB sharing scenarios.

The Azure File Service is built on the same technology as the Blob, Table, and Queue Services, which means Azure Files is able to leverage the existing availability, durability, scalability, and geo redundancy that is built into our Storage platform. It is provided as a high-availability managed service run by us, meaning you don’t have to manage any VMs to coordinate it and we take care of all backups and maintenance for you.
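
As a rough sketch, assuming the preview Azure Files PowerShell cmdlets are installed, you can create a share and then mount it from a Windows VM (the account, key, and share names are placeholders):

# Sketch: create a file share in a storage account.
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey $key
New-AzureStorageShare -Name "tools" -Context $ctx

# From inside a VM in the same region, mount the share over SMB:
net use z: \\myaccount.file.core.windows.net\tools /u:myaccount <storage-account-key>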

Common Scenarios

  • Lift and Shift applications: Azure Files makes it easier to “lift and shift” existing applications to the cloud that use on-premises file shares to share data between parts of the application.
  • Shared Application Settings: A common pattern for distributed applications is to have configuration files in a centralized location where they can be accessed from many different virtual machines. Such configuration files can now be stored in an Azure File share, and read by all application instances.
  • Diagnostic Share: An Azure File share can also be used to save diagnostic files like logs, metrics, and crash dumps. Having these available through both the SMB and REST interface allows applications to build or leverage a variety of analysis tools for processing and analyzing the diagnostic data.
  • Dev/Test/Debug: When developers or administrators are working on virtual machines in the cloud, they often need a set of tools or utilities. Installing and distributing these utilities on each virtual machine where they are needed can be a time consuming exercise. With Azure Files, a developer or administrator can store their favorite tools on a file share, which can be easily connected to from any virtual machine.

To learn more about how to use the new Azure File Service visit here.

RemoteApp: Preview of new Remote App Service

I’m happy to announce the public preview today of Azure RemoteApp, a new service delivering Windows Client applications from the Azure cloud.

Azure RemoteApp can be used by IT to enable employees to securely access their corporate applications from a variety of devices (including mobile devices like iPads and Phones).  Applications can be scaled up or down quickly without expensive infrastructure costs and management complexity.

With Azure RemoteApp, your client applications run in the Azure cloud. Employees simply install the Microsoft Remote Desktop client on their devices and then can access applications via Microsoft’s Remote Desktop Protocol (RDP).  IT can optionally connect the applications back to on-premises networks (enabling hybrid connectivity) or alternatively run them entirely in the cloud.

clip_image048

With this service, you can bring scale, agility and global access to your business applications.

Azure RemoteApp is free during the preview period. Learn more about Azure RemoteApp and try the service for free during the preview.

Hybrid Connections: Easily integrate Azure Websites and Mobile Services with on-premises resources

I’m excited to announce Hybrid Connections, a new and easy way to build hybrid applications on Azure. Hybrid Connections enable your Azure Website or Mobile Service to connect to on-premises data & services with just a few clicks within the Azure Management portal.  Today, we're also introducing a Free tier of Azure BizTalk Services that enables everyone to use this new hybrid connections feature for free.

With Hybrid Connections, Azure websites and mobile services can easily access on-premises resources as if they were located on the same private network. This makes it much easier to move applications to the cloud, while still connecting securely with existing enterprise assets.

image

Hybrid Connections support all languages and frameworks supported by Azure Websites (.NET, PHP, Java, Python, node.js) and Mobile Services (node.js, .NET).

The Hybrid Connections service does not require you to enable a VPN or open up firewall rules in order to use it. This makes it easy to deploy within enterprise environments.  Built-in monitoring and management support gives enterprise administrators control of, and visibility into, the resources accessed by their hybrid applications.

You can learn more about Hybrid Connections using the following links:

API Management: Announcing Preview of new Azure API Management Service

With the proliferation of mobile devices, it is important for organizations to be able to expose their existing backend systems via mobile-friendly APIs that enable internal app developers as well as external developer programs. Today, I’m excited to announce the public preview of the new Azure API Management service that helps you better achieve this.

The new Azure API Management service allows you to create an easy-to-use API façade over a diverse set of mobile backend services (including Mobile Services, Web Sites, VMs, Cloud Services and on-premises systems). It enables you to deliver a friendly API developer portal to your customers with documentation and samples, to add per-developer metering that protects your APIs from abuse and overuse, and to monitor and track API usage analytics:

clip_image014

Creating an API Management service

You can easily create a new instance of the Azure API Management service from the Azure Management Portal by clicking New->App Services->API Management->Create. Once the service has been created, you can get started on your API by clicking on the Manage button and transitioning to the Dashboard page on the Publisher portal.

clip_image015

Publishing an API

A typical API publishing workflow involves first creating a façade over an existing backend service, then configuring policies on it, and finally packaging/publishing the API to the Developer portal for developers to consume.

To create an API, select the Add API button within the publisher portal, and in the dialog that appears enter the API name, location of the backend service and suffix of the API root under the service domain name.  Note that you can implement the back-end of the API anywhere (including non-Azure cloud providers or locations).  You can also obviously host the API using Azure – including within a VM, Cloud Service, Web Site or Mobile Service.

Once you’ve defined the settings, click Save to create the API endpoint:

clip_image017

 

Once you’ve created your API endpoint, you can customize it.  You can also set policies such as caching rules, usage quotas and rate limits that apply to developers calling the API. These features end up being extremely useful when publishing an API for external developers (or mobile apps) to consume, and help ensure that your APIs cannot be abused.

Developer Portal

Once your API has been published, click on the Developer Portal link.  This will launch a developer portal page that can be used by developers to learn how to consume and use the API that you have published.  It provides a bunch of built-in support to help you create documentation pages for your APIs, as well as built-in testing tools.  You’ll also get an impressive list of copy-and-paste-ready code samples that help teach developers how to invoke your APIs from the most popular programming languages.  Best of all this is all automatically generated for you:

clip_image023

You can test out any of the APIs you’ve published without writing a line of code by using the interactive console.

clip_image025

Analytics and reports

Once your API is published, you’ll want to be able to track how it is being used.  Back in the publisher portal you can click on the Analytics page to find reports on various aspects of the API, such as usage, health, latency, cache efficiency and more. With a single click, you can find out your most active developers and your most popular APIs and products. You can get time series metrics as well as maps to show which geographies drive them.

clip_image027

Learn More

We are really excited about the new API Management service, and it is going to make securely publishing and tracking external APIs much simpler.  To learn more about API Management, follow the tutorials below:

Cache: New Azure Redis Cache Service

I’m excited to announce the preview of a new Azure Redis Cache Service.

This new cache service gives customers the ability to use a secure, dedicated Redis cache, managed by Microsoft. With this offer, you get to leverage the rich feature set and ecosystem provided by Redis, and reliable hosting and monitoring from Microsoft.

We are offering the Azure Redis Cache Preview in two tiers:

  • Basic – A single Cache node (ideal for dev/test and non-critical workloads)
  • Standard – A replicated Cache (Two nodes, a Master and a Slave)

During the preview period, the Azure Redis Cache will be available in 250 MB and 1 GB sizes. For a limited time, the cache will be offered free, with a limit of two caches per subscription.

Creating a New Cache Instance

Getting started with the new Azure Redis Cache is easy.  To create a new cache, sign in to the Azure Preview Portal, and click New -> Redis Cache (Preview):

image

Once the new cache options are configured, click Create. It can take a few minutes for the cache to be created. After the cache has been created, your new cache has a Running status and is ready for use with default settings:

clip_image042

Connect to the Cache

Application developers can use a variety of languages and corresponding client packages to connect to the Azure Redis Cache. Below we’ll use a .NET Redis client called StackExchange.Redis to connect to the cache endpoint. You can open any Visual Studio project and add the StackExchange.Redis NuGet package to it, via the NuGet package manager.

The cache endpoint and key can be obtained respectively from the Properties blade and the Keys blade for your cache instance within the Azure Preview Portal:

clip_image046

Once you’ve retrieved these you can create a connection instance to the cache with the code below:

var connection = StackExchange.Redis.ConnectionMultiplexer.Connect("contoso5.redis.cache.windows.net,ssl=true,password=...");

Once the connection is established, you can retrieve a reference to the Redis cache database by calling the ConnectionMultiplexer.GetDatabase method.

IDatabase cache = connection.GetDatabase();

Items can be stored in and retrieved from a cache by using the StringSet and StringGet methods.

cache.StringSet("Key1", "HelloWorld");

string value = cache.StringGet("Key1");

You have now stored and retrieved a “Hello World” string from a Redis cache instance running on Azure.

Learn More

For more information, visit the following links:

Store: Support for EA customers and channel partners in the Azure Store

With today’s update we are expanding the Azure Store to customers and channel partners subscribed to Azure via a direct Enterprise Agreement (EA). Azure EA customers in North America and Europe can now purchase a range of application and data services from 3rd party providers through the Store and have these subscriptions automatically billed against their EA.

image

You will be billed against your EA each quarter for all of your Store purchases on a separate, consolidated invoice.  Access to Azure Store can be managed by your EA Azure enrollment administrators, by going to Manage Accounts and Subscriptions under the Accounts section in the Enterprise Portal, where you can disable or re-enable access to 3rd party purchases via Store.  Please visit Azure Store to learn more.

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Azure: 99.95% SQL Database SLA, 500 GB DB Size, Improved Performance, Self-Service Restore, and Business Continuity

Earlier this month at the Build conference, we announced a number of great new improvements coming to SQL Databases on Azure including: an improved 99.95% SLA, support for databases up to 500GB in size, self-service restore capability, and new Active Geo Replication support.  This 3 minute video shows a segment of my keynote where I walked through the new capabilities:

image

Last week we made these new capabilities available in preview form, and also introduced new SQL Database service tiers that make it easy to take advantage of them.

New SQL Database Service Tiers

Last week we introduced a new Basic and Standard tier option with SQL Databases – which are additions to the existing Premium tier we previously announced.  Collectively these tiers provide a flexible set of offerings that enable you to cost effectively deploy and host SQL Databases on Azure:

  • Basic Tier: Designed for applications with a light transactional workload. Performance objectives for Basic provide a predictable hourly transaction rate.
  • Standard Tier: Standard is the go-to option for cloud-designed business applications. It offers mid-level performance and business continuity features. Performance objectives for Standard deliver predictable per minute transaction rates.
  • Premium Tier: Premium is designed for mission-critical databases. It offers the highest performance levels and access to advanced business continuity features. Performance objectives for Premium deliver predictable per second transaction rates.

You do not need to buy a SQL Server license in order to use any of these pricing tiers – all of the licensing and runtime costs are built-into the price, and the databases are automatically managed (high availability, auto-patching and backups are all built-in).  We also now provide you the ability to pay for the database at the per-day granularity (meaning if you only run the database for a few days you only pay for the days you had it – not the entire month). 

The price for the new SQL Database Basic tier starts as low as $0.16/day ($4.96 per month) for a 2 GB SQL Database.  During the preview period we are providing an additional 50% discount on top of these prices.  You can learn more about the pricing of the new tiers here.

Improved 99.95% SLA and Larger Database Sizes

We are extending the availability SLA of all of the new SQL Database tiers to be 99.95%.  This SLA applies to the Basic, Standard and Premium tier options – enabling you to deploy and run SQL Databases on Azure with even more confidence.

We are also increasing the maximum sizes of databases that are supported:

  • Basic Tier: Supports databases up to 2 GB in size
  • Standard Tier: Supports databases up to 250 GB in size. 
  • Premium Tier: Supports databases up to 500 GB in size.

Note that the pricing model for our service tiers has also changed so that you no longer need to pay a per-database size fee (previously we charged a per-GB rate) - instead we now charge a flat rate per service tier.

Predictable Performance Levels with Built-in Usage Reports

Within the new service tiers, we are also introducing the concept of performance levels, which are a defined level of database resources that you can depend on when choosing a tier.  This enables us to provide a much more consistent performance experience that you can design your application around.

The resources of each service tier and performance level are expressed in terms of Database Throughput Units (DTUs). A DTU provides a way to describe the relative capacity of a performance level based on a blended measure of CPU, memory, and read and write rates. Doubling the DTU rating of a database equates to doubling the database resources.  You can learn more about the performance levels of each service tier here.

Monitoring your resource usage

You can now monitor the resource usage of your SQL Databases via both an API and the Azure Management Portal.  Metrics include CPU, reads/writes and memory (memory is not available this week but coming soon).  You can also track your performance usage (as a percentage) relative to the available DTU resources within your service tier level:

Performance Metrics

Dynamically Adjusting your Service Tier

One of the benefits of the new SQL Database Service Tiers is that you can dynamically increase or decrease them depending on the needs of your application.  For example, you can start off on a lower service tier/performance level and then gradually increase the service tier levels as your application becomes popular and you need more resources. 

It is quick and easy to change between service tiers or performance levels — it’s a simple online operation.  Because you now pay for SQL Databases by the day (as opposed to the month) this ability to dynamically adjust your service tier up or down also enables you to leverage the elastic nature of the cloud and save money.
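
As a rough sketch, changing a database’s tier and performance level from PowerShell might look like the following, using the service management SQL Database cmdlets (the server, database, and objective names are placeholders):

# Sketch: move a database to the Standard tier / S1 performance level.
$ctx = New-AzureSqlDatabaseServerContext -ServerName "myserver" -UseSubscription
$s1  = Get-AzureSqlDatabaseServiceObjective $ctx -ServiceObjectiveName "S1"
Set-AzureSqlDatabase $ctx -DatabaseName "mydb" -Edition Standard -ServiceObjective $s1 -Force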

Read this article to learn more about how performance works in the new system and the benchmarks for each service tier.

New Self-Service Restore Support

Have you ever had that sickening feeling when you’ve realized that you inadvertently deleted data within a database and might not have a backup?  We now have built-in self-service restore support with SQL Databases that helps you protect against this.  This support is available in all service tiers (even the Basic Tier).

SQL Database now automatically takes database backups daily and log backups every 5 minutes. The daily backups are also stored in geo-replicated Azure Storage (which will store a copy of them at least 500 miles away from your primary region).

Using the new self-service restore functionality, you can now restore your database to a point in time in the past as defined by the specified backup retention policies of your service tier:

  • Basic Tier: Restore from most recent daily backup
  • Standard Tier: Restore to any point in last 7 days
  • Premium Tier: Restore to any point in last 35 days

Restores can be accomplished using either an API we provide or via the Azure Management Portal:

clip_image004
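
As a rough sketch, a point-in-time restore can be kicked off from PowerShell like this (the server and database names are placeholders; the restore creates a new database rather than overwriting the existing one):

# Sketch: restore "mydb" to a new database as it existed at a prior point in time.
Start-AzureSqlDatabaseRestore -SourceServerName "myserver" -SourceDatabaseName "mydb" `
    -TargetDatabaseName "mydb-restored" -PointInTime "2014-04-20 10:00:00"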

New Active Geo-replication Support

For Premium Tier databases, we are also adding support that enables you to create up to 4 readable secondary databases in any Azure region.  When active geo-replication is enabled, we will ensure that all transactions committed to the database in your primary region are continuously replicated to the databases in the other regions as well:

image

One of the primary benefits of active geo-replication is that it provides application control over disaster recovery at a database level.  Having cross-region redundancy enables your applications to recover in the event of a disaster (e.g. a natural disaster). 

The new active geo-replication support enables you to initiate/control any failovers – allowing you to shift the primary database to any of your secondary regions:

image

This provides a robust business continuity offering, and enables you to run mission critical solutions in the cloud with confidence.  You can learn more about this support here.
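
As a rough sketch, a continuous copy relationship can be established and later terminated from PowerShell (the server and database names are placeholders, and the failover usage shown is my assumption):

# Sketch: start continuous geo-replication of "mydb" to a server in another region.
Start-AzureSqlDatabaseCopy -ServerName "primarysrv" -DatabaseName "mydb" -PartnerServer "secondarysrv" -ContinuousCopy

# A forced termination ends the copy relationship, making the secondary writable.
Stop-AzureSqlDatabaseCopy -ServerName "secondarysrv" -DatabaseName "mydb" -ForcedTermination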

Start using the Preview of all of the Above Features Today!

All of the above features are now available to start using in preview form. 

You can sign-up for the preview by visiting our Preview center and clicking the “Try Now” button on the “New Service Tiers for SQL Databases” option.  You can then choose which Azure subscription you wish to enable them for.  Once enabled, you can immediately start creating new Basic, Standard or Premium SQL Databases.

Summary

This update of SQL Database support on Azure provides some great new features that enable you to build even better cloud solutions.  If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Azure Updates: Web Sites, VMs, Mobile Services, Notification Hubs, Storage, VNets, Scheduler, AutoScale and More

It has been a really busy last 10 days for the Azure team. This blog post quickly recaps a few of the significant enhancements we’ve made.  These include:

  • Web Sites: SSL included, Traffic Manager, Java Support, Basic Tier
  • Virtual Machines: Support for Chef and Puppet extensions, Basic Pricing tier for Compute Instances
  • Virtual Network: General Availability of DynamicRouting VPN Gateways and Point-to-Site VPN
  • Mobile Services: Preview of Visual Studio support for .NET, Azure Active Directory integration and Offline support
  • Notification Hubs: Support for Kindle Fire devices and Visual Studio Server Explorer integration
  • Autoscale: General Availability release
  • Storage: General Availability release of Read Access Geo Redundant Storage
  • Active Directory Premium: General Availability release
  • Scheduler service: General Availability release
  • Automation: Preview release of new Azure Automation service

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Web Sites: SSL now included at no additional charge in Standard Tiers

With Azure Web Sites you can host up to 500 web-sites in a single standard tier hosting plan.  Azure web-sites run in VMs isolated to host only your web applications (giving you predictable performance and security isolation), and you can scale-up/down the number of VMs either manually or using our built-in AutoScale functionality.  The pricing for standard tier web-sites is based on the number of VMs you run – if you host all 500 web-sites in a single VM then all you pay for is for that single VM, if you scale up your web site plan to run across two VMs then you’d pay for two VMs, etc.

Prior to this month we charged an additional fee if you wanted to enable SSL for the sites.  Starting this month, we now include the ability to use 5 SNI based SSL certificates and 1 IP based SSL certificate with each standard tier web site hosting plan at no additional charge.  This helps make it even easier (and cheaper) to SSL enable your web-sites.

Web Sites: Traffic Manager Support

I’ve blogged in the past about the Traffic Manager service we have built-into Azure. 

The Azure Traffic Manager service allows you to control the distribution of user traffic to applications that you host within Azure. This enables you to run instances of your applications across different Azure regions all over the world.  Traffic Manager works by applying an intelligent routing policy engine to the Domain Name Service (DNS) queries on your domain names, and maps the DNS routes to the appropriate instances of your applications (e.g. you can setup Traffic Manager to route customers in Europe to a European instance of your app, and customers in North America to a US instance of your app).

You can use Traffic Manager to improve application availability - by enabling automatic customer traffic fail-over scenarios in the event of issues with one of your application instances.  You can also use Traffic Manager to improve application performance - by automatically routing your customers to the application instance closest to them.

We are excited to now provide general availability support of Traffic Manager with Azure Web Sites.  This enables you to both improve the performance and availability of your web-sites.  You can learn more about how to take advantage of this new support here.

Web Sites: Java Support

This past week we added support for an additional server language with Azure Web Sites – Java.  It is now easy to deploy and run Java web applications written using a variety of frameworks and containers including:

  • Java 1.7.0_51 – this is the default supported Java runtime
  • Tomcat 7.0.50 – the default Java container
  • Jetty 9.1.0

You can manage which Java runtime you use, as well as which container hosts your applications using the Azure management portal or our management APIs.  This blog post provides more detail on the new support and options.

With this announcement, Azure Web Sites now provides first class support for building web applications and sites using .NET, PHP, Node.js, Python and Java.  This enables you to use a wide variety of language + frameworks to build your applications, and take advantage of all the great capabilities that Web Sites provide (Easy Deployment, Continuous Deployment, AutoScale, Staging Support, Traffic Manager, outside-in monitoring, Backup, etc).

Web Sites: Support for Wildcard DNS and SSL Certificates

Azure Web Sites now supports the ability to map wildcard DNS and SSL Certificates to web-sites.  This enables a variety of scenarios – including the ability to map wildcard vanity domains (e.g. *.myapp.com – for example: scottgu.myapp.com) to a single backend web site.  This can be particularly useful for SaaS based scenarios.

Scott Cate has an excellent video that walks through how to easily set this support up.

Web Sites: New Basic Tier Pricing Option

Earlier in this post I talked about how we are now including the ability to use 5 SNI and 1 IP based SSL certificate at no additional cost with each standard tier Azure web site hosting plan.  We have also recently announced that we are including the auto-scale, traffic management, backup, staging and web jobs features at no additional cost as part of each standard tier Azure web site hosting plan as well.  We think the combination of these features provides an incredibly compelling way to securely host and run any web application.

New Basic Tier Pricing Option

Starting this month we are also introducing a new “basic tier” option for Azure web sites which enables you to run web applications without some of these additional features – and at 25% less cost.  We think the basic tier is great for smaller/less-sophisticated web applications, and enables you to be successful while paying even less. 

For additional details about the Basic tier pricing, visit the Azure Web sites pricing page.  You can select which tier your web-site hosting plan uses by clicking the Scale tab within the Web Site extension of the Azure management portal.

Virtual Machines: Create from Visual Studio

With the most recent Azure SDK 2.3 release, it is now possible to create Virtual Machines from directly inside Visual Studio’s Server Explorer.  Simply right-click on the Azure node within it, and choose the “Create Virtual Machine” menu option:

image

This will bring up a “Create New Virtual Machine” wizard that enables you to walk through creating a Virtual Machine, picking an image to run in it, attaching it to a virtual network, and opening up firewall ports all from within Visual Studio:

image

Once created you can then manage the VM (shutdown, restart, start, remote desktop, enable debugging, attach debugger) all from within Visual Studio:

image

This makes it incredibly easy to start taking advantage of Azure without having to leave the Visual Studio IDE.

Virtual Machines: Integrated Puppet and Chef support

In a previous blog post I talked about the new VM Agent we introduced as an optional extension to Virtual Machines hosted on Azure.  The VM Agent is a lightweight and unobtrusive process that you can optionally enable to run inside your Windows and Linux VMs. The VM Agent can then be used to install and manage extensions, which are software modules that extend the functionality of a VM and help make common management scenarios easier. 

At last week’s Build conference we announced built-in support for several new extensions – including extensions that enable easy support for Puppet and Chef.  Puppet and Chef allow developers and IT administrators to define and automate the desired state of infrastructure configuration, making it effortless to manage 1000s of VMs in Azure.

Enabling Puppet Support

We now have a built-in VM image within the Azure VM gallery that enables you to easily stand up a puppet-master server that you can use to store and manage your infrastructure using Puppet.  Creating a Puppet Master in Azure is now easy – simply select the “Puppet Enterprise” template within the VM gallery:

image

You can then create new Azure virtual machines that connect to this Puppet Master.  Enabling this with VMs created using the Azure management portal is easy (we also make it easy to do with VMs created with the command-line).  To enable the Puppet extension within a VM you create using the Azure portal, simply navigate to the last page of the Create VM from gallery experience and check the “Puppet Enterprise Agent” extension within it:

image

Specify the URL of the Puppet master server to get started. Once you deploy the VM, the extension will configure the puppet agent to connect to this Puppet master server and pull down the initial configuration that should be used to configure the machine.
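
For VMs created from the command-line, a rough sketch of the same setup with the Puppet extension cmdlet might look like the following (the Puppet master URL and all names are placeholders, and the exact cmdlet shape is worth verifying against the current SDK):

# Sketch: create a VM with the Puppet agent extension pointed at a Puppet master.
$vm = New-AzureVMConfig -Name "web1" -InstanceSize Small -ImageName $imageName |
      Add-AzureProvisioningConfig -Windows -AdminUsername $user -Password $password |
      Set-AzureVMPuppetExtension -PuppetMasterServer "mypuppetmaster.cloudapp.net"
New-AzureVM -ServiceName "MyApp" -VMs $vm -Location "East US"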

This new support makes it incredibly easy to get started with both Puppet and Chef and enable even richer configuration management of your IaaS infrastructure within Azure.

Virtual Machines: Basic Tier

Earlier in this blog post I discussed how we are introducing a new “Basic Tier” option for Azure Web Sites.  Starting this month we are also introducing a “Basic Tier” for Virtual Machines as well.

The Basic Tier option provides VM sizes with CPU + memory configurations similar to our existing VMs (which are now called “Standard Tier” VMs), but does not include the built-in load balancing and AutoScale capabilities.  Basic Tier VMs also cost up to 27% less.  These instances are well-suited for production applications that do not require a built-in load balancer (you can optionally bring your own load balancer), batch processing scenarios, as well as for dev/test workloads.  Our new Basic tier VMs also have similar performance characteristics to AWS’s equivalent VM instances (which are less powerful than the Standard tier VMs we have today).

Comprehensive pricing information is now available on the Virtual Machines Pricing Details page.

Networking: General Availability of Azure Virtual Network Dynamic Routing VPN Gateways and Point-to-Site VPN

Last year, we previewed a feature called DynamicRouting Gateway and Point-to-Site VPN that supports Route-based VPNs and allows you to connect individual computers to a Virtual Network in Azure. Earlier this month we announced that the feature is now generally available. The DynamicRouting VPN Gateway in a Virtual Network will now carry the same 99.9% SLA as the StaticRouting VPN Gateway.

clip_image037

Now that we’re in General Availability mode, the DynamicRouting Gateway will automatically incur standard Gateway charges, which will take effect starting May 1, 2014. 

For further details on the service, please visit the Virtual Network website.

Mobile Services: Visual Studio Support for Mobile Services .NET Backend

With Visual Studio 2013 Update 2, you can now create your backend Mobile Service logic using .NET and the ASP.NET Web API framework in Visual Studio, using Mobile Services templates and scaffolds. Mobile Services support for .NET on the backend offers the following benefits:

  1. You can use ASP.NET Web API and Visual Studio together with Mobile Services to add a backend to your mobile app in minutes
  2. You can publish any existing Web API to Mobile Services and benefit from authentication, push notifications and other capabilities that Mobile Services provides. You can also take advantage of any Web API features like OData controllers, or 3rd party Web API-based frameworks like Breeze.
  3. You can debug your Mobile Services .NET backend using Visual Studio running locally on your machine or remotely in Azure.
  4. With Mobile Services we run, manage and monitor your Web API for you. Azure will automatically notify you if we discover you have a problem with your app.
  5. With Mobile Services .NET support you can store your data securely using any data backend of your choice: SQL Azure, SQL on Virtual Machine, Azure Table storage, Mongo, et al.

It’s easy to get started with Mobile Services .NET support in Visual Studio. Simply use the File->New Project dialog and select the Windows Azure Mobile Service project template under the Cloud node.

clip_image012

Choose Windows Azure Mobile Service in the New ASP.NET Project dialog.

clip_image014

You will see a Mobile Services .NET project. Notice that this is a customized ASP.NET Web API project with additional Mobile Service NuGet packages and sample controllers automatically included:

clip_image016

Running the Mobile Service Locally

You can now test your .NET Mobile Service project locally. Open the sample TodoItemController.cs in the project. The controller shows you how you can use the built-in TableController<T> .NET class we provide with Mobile Services. Set a breakpoint inside the GetAllTodoItems() method and hit F5 within Visual Studio to run the Mobile Service locally.

clip_image018

Mobile Services includes a help page to view and test your APIs. On the help page, click the try it out link, then click the GET tables/TodoItem link, and then click try this out and send on the GET tables/TodoItem page. As you might expect, you will hit the breakpoint you set earlier.

clip_image020

Add APIs to your Mobile Service using Scaffolds

You can add additional functionality to your Mobile Service using Mobile Service or generic Web API controller scaffolds through the Add Scaffold dialog (right-click on your project and choose the Add -> New Scaffolded Item… command)

clip_image022

Publish your Mobile Services project to Azure

Once you are done developing your Mobile Service locally, you can publish it to Azure. Simply right click on your project and choose the Publish command. Using the publish wizard, you can publish to a new or existing Azure Mobile Service:

clip_image024

Remote debugging

Just like Cloud Services and Websites, you can now remote debug your Mobile Service to get more visibility into how your code is operating live in Azure. To enable remote debugging for a Mobile Service, publish your Mobile Service again and set the Configuration to Debug in the Publish wizard.

clip_image027

Once your Mobile Service is published and running live in the cloud, simply set a breakpoint in local source code. Then use Visual Studio’s Server Explorer to select the Mobile Service instance deployed in the cloud, right click and choose the Attach Debugger command.

clip_image028

Once the debugger attaches to the mobile service, you can use the debugging capabilities of Visual Studio to debug your app in real time as it runs in the cloud.

To learn more about Visual Studio Support for Mobile Services .NET backend follow tutorials at:

This new .NET backend support makes it easier than ever to create great mobile applications.

Mobile Services: Offline Support

In addition to the above support, we are also introducing a preview of a new Mobile Services Offline capability with client SDK support for Windows Phone and Windows Store apps.

With this functionality, mobile applications can create and modify data even when they are offline/disconnected from a network. When the app is back online, it can synchronize local changes with the Mobile Services Table APIs. The feature also includes support for detecting conflicts when the same record is changed on both the client and the backend.

To use the new Mobile Services offline functionality, set up a local sync store. You can define your own sync store or use the provided SQLite-based implementation.  The Mobile Services SDK provides a new local table API for the sync store, with a symmetrical programming model to the existing Mobile Services Table API. You can use Optimistic Concurrency along with the offline feature to detect conflicting changes between the client and backend.

The preview of the Mobile Services Offline feature is available now as part of the Mobile Services SDK for Windows Store and Windows Phone apps. In the future, we will support all client platforms supported by Mobile Services, including iOS, Android, Xamarin, etc.

Mobile Services: Support for Azure Active Directory Sign On

We now support Azure Active Directory Single Sign On for Mobile Services.  Azure Active Directory authentication is available for both the .NET and Node.js backend options of Mobile Services.

To take advantage of the feature, first register your client app and your Mobile Service with your Azure Active Directory tenant using the Applications tab in the Azure Active Directory management portal.

clip_image030

In your client project, you will need to add the Active Directory Authentication Library (ADAL), currently available for Windows Store, iOS, and Android clients.

From there on, the token retrieved from the ADAL library can be used to authenticate and access Mobile Services.  The single sign-on features of ADAL also enable your mobile service to make calls to other resources (such as SharePoint and Office 365) on behalf of the user.  You can read more about the new ADAL functionality here.

These new updates make Mobile Services an even more attractive platform for building powerful employee facing apps.

Notification Hub: Kindle Support and Visual Studio Integration

I’ve previously blogged about Azure Notification Hubs, a high scale cross platform push notification service that allows you to instantly send personalized push notifications to segments of your audience or to individual users, across millions of iOS, Android, Windows, and Windows Phone devices, with a single API call.

Today we’ve made two important updates to Azure Notification Hubs: adding support for Amazon Kindle Fire devices, and Visual Studio support for Notification Hubs.

Support for Amazon’s Kindle

With today’s addition you can now configure your Notification Hubs with Amazon Device Messaging (ADM) service credentials on the configuration page for your Notification Hub in the Azure Management portal, and start sending push notifications to your app on Amazon’s Kindle device, in addition to iOS, Android, or Windows.

clip_image032

Testing Push Notifications with Visual Studio

Earlier I blogged about how we enabled debugging push notifications using the Azure Management Portal. With today’s Visual Studio update, you can now browse your notification hubs and send test push notifications directly from Visual Studio Server Explorer as well.

Simply select your notification hub in the Server Explorer of Visual Studio under the Notifications Hubs node.  Then right click, and choose the Send Test Notifications command:

clip_image033

In the notification hub window, you can then send a message either to a particular tag or all registered devices (broadcast). You can select from a variety of templates - Windows Store, Windows Phone, Android, iOS, or even a cross platform message using the Custom Template. After you hit Send, you’ll receive the message result instantly to help you diagnose if your message was successfully sent or not.

clip_image035

To learn more about Azure Notification Hubs, read tutorials here.

AutoScale: Announcing General Availability of Autoscale Service

Last summer we announced the preview release of our Autoscale service. I’m happy to announce that Autoscale is now generally available!  Better yet, there's no additional charge for using Autoscale.

We've added new features since we first released it as a preview version: support for both performance- and schedule-based autoscaling, along with an API and .NET SDK so you can programmatically scale using any performance counters that you define.

Autoscale supports all four Azure compute services: Cloud Services, Virtual Machines, Mobile Services and Web Sites. For Virtual Machines and Web Sites, Autoscale is included as a feature in the Standard pricing tiers, and for Mobile Services, it's included as a part of both Basic and Standard pricing tiers.

Storage: Announcing General Availability of Read Access Geo Redundant Storage (RA-GRS)

In December, we added the ability for customers to achieve higher read availability for their data. This feature, called Read Access Geo Redundant Storage (RA-GRS), allows you to read an eventually consistent copy of your geo-replicated data from the storage account’s secondary region in case of any unavailability of the storage account’s primary region.

Last week we announced that the RA-GRS feature is now out of preview and generally available. It is available to all Azure customers across all regions, including users in China.

RA-GRS SLA and Pricing

The benefit of using RA-GRS is that it provides higher read availability (99.99+%) for a storage account than GRS (99.9+%). When using RA-GRS, write availability continues to be 99.9+% (the same as GRS today), while read availability is 99.99+%, with data expected to be read from the secondary if the primary is unavailable. In terms of pricing, the capacity (GB) charge is slightly higher for RA-GRS than GRS, whereas the transaction and bandwidth charges are the same for GRS and RA-GRS. See the Windows Azure Storage pricing page here for more details about the SLA and pricing.

You can find more information on the storage blog here.

Active Directory: General Availability of Azure AD Premium

Earlier this month we announced the general availability of Azure Active Directory Premium, which provides additional identity and access management capabilities for enterprises. Building upon the capabilities of Azure AD, Azure AD Premium provides these capabilities with a guaranteed SLA and no limit on directory size. Additional capabilities include:

  • Group-based access assignment enables administrators to use groups in AD to assign access for end users to over 1200 cloud applications in the AD Application Gallery. End users can get single-sign on access to their applications from their Access Panel at https://myapps.microsoft.com or from our iOS application.
  • Self-service password reset that enables end users to reset forgotten passwords without calling your help desk.
  • Delegated group management that enables end users to create security groups and manage membership in security groups they own.
  • Multi-Factor Authentication that lets you easily deploy a Multi-Factor Authentication solution for your business without deploying new software or hardware.
  • Customized branding that lets you include your organization’s branding elements in the experiences that users see when signing in to AD or accessing their Access Panel.
  • Reporting, alerting, and analytics that increase your visibility into application usage in your organization, and potential security concerns with user accounts.

Azure AD Premium also includes usage rights for Forefront Identity Manager Server and Client Access Licenses.

To read more about AD Premium, including how to acquire it, read the Active Directory Team blog.

Active Directory: Public Preview of Azure Rights Management Service

Earlier this month we announced the public preview of the ability to manage your Azure Rights Management service within the Azure Management Portal. If your organization has Azure Rights Management either as a stand-alone service or as part of your Office 365 or EMS subscriptions you can now manage it by signing into the Azure Management Portal. Once in the Portal, select ACTIVE DIRECTORY in the left navigation bar, navigate to the RIGHTS MANAGEMENT tab, then click on the name of your directory.

clip_image039

With this preview you can now create custom rights policy templates that let you define who can access sensitive documents, and what permissions (view, edit, save, print, and more) users can have on those documents.  To begin creating a rights policy template, on the Quick Start page, click the Create an additional rights policy template option and follow the instructions on the page to define a name and description for the template, add users and rights, and define other restrictions.

clip_image041

Once your template has been created and published, it will become available to users in your organization in their favorite applications.

clip_image043

To learn more about managing Azure Rights Management and the benefits it offers to organizations, see the Information Protection group’s blog.

Scheduler: General Availability Release of the Scheduler Service

This month we’ve also delivered the General Availability release of the Azure Scheduler service.  Scheduler allows you to run jobs on simple or complex recurring schedules that can invoke HTTP/S endpoints or post messages to storage queues. Scheduler has built-in high availability and can reliably call services inside or outside of Azure.

During the preview, customers used it for a wide set of scenarios, including invoking services in their backend for Hadoop workloads, triggering diagnostics cleanup, and periodically checking that partners have submitted content on time. ISVs have used it to add scheduling capabilities to their applications, such as report generation and sending reminders.

In the Scheduler portal extension you can easily create and manage your scheduler jobs. Since the initial release, Scheduler has added the ability to update HTTP jobs with custom headers and basic authentication. It has also exposed the ability to change the recurrence schedule, allowing you to limit the number of executions of a job or let the job run indefinitely.

With the general availability, new Azure Scheduler cmdlets have been released with Azure PowerShell and the Scheduler .NET API has been included in WAML 1.0.
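
As a rough sketch with the new cmdlets, creating a job collection and an hourly HTTP job might look like the following (the names, URI, and location are placeholders):

# Sketch: create a job collection and an HTTP job that GETs an endpoint every hour.
New-AzureSchedulerJobCollection -JobCollectionName "MyJobs" -Location "North Central US"
New-AzureSchedulerHttpJob -JobCollectionName "MyJobs" -JobName "health-ping" -Location "North Central US" `
    -Method GET -URI "http://contoso.com/api/health" -Frequency Hour -Interval 1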

I highly encourage you to try out the Scheduler today. You might find the following links helpful:

It makes scheduling recurring tasks really easy.

Automation: Announcing Microsoft Azure Automation Preview

Last week we announced the preview of a new Microsoft Azure service: Automation.

Automation allows you to automate the creation, deployment, monitoring, and maintenance of resources in your Azure environment using a highly scalable and reliable workflow execution engine. The service can be used to orchestrate the time-consuming, error-prone, and frequently repeated tasks you’d otherwise accomplish manually across Microsoft Azure and third-party systems to decrease operational expense for your cloud operations.

To get started with Automation, you first need to sign-up for the preview on the Azure Preview page. Once you have been approved for the preview, you can sign in to the Management Portal and start using it. Automation is currently only available in the East US data center, but we will add the ability to deploy to additional data centers in the future.

Authoring a Runbook

Once you have the Automation preview enabled on your subscription, you can easily get started automating by following a few simple steps:

Step 1: In the Microsoft Azure management portal, click New->App Services->Automation->Runbook->Quick Create to create a new runbook. Runbooks are collections of activities that provide an environment for automating everything from diagnostic logging to applying updates to all instances of a virtual machine or web role to renewing certificates to cleaning storage accounts. Enter a name and description for the runbook, and create a new Automation account which will store your Runbooks, Assets, and Jobs.

Next time you create a runbook you can either use the same Automation account you just created, or create a separate one if you’d like to maintain separation between different collections of runbooks/assets.

clip_image045

clip_image047

Step 2: Click on your runbook, then click Author->Draft. Type some PowerShell commands in the editor, then hit ‘Publish’ to make this runbook draft available for production execution.

clip_image049
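
Runbooks are PowerShell Workflows under the covers. As a trivial sketch, the following is the kind of draft you could paste into the editor and publish (the workflow name and logic are purely illustrative):

# Sketch: a minimal runbook - a PowerShell Workflow that emits output,
# including two activities that run in parallel.
workflow Hello-Azure
{
    param([string]$Name = "World")
    "Hello, $Name!"
    parallel
    {
        "First branch"
        "Second branch"
    }
}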

Starting a Runbook and Viewing the Job

1. To start the runbook you just published, go back to the ‘Runbooks’ tab, click on your newly-published runbook, and hit ‘Start.’ Enter any required parameters for the runbook, then click the checkmark button.

clip_image051

2. Click on your runbook, then click on the ‘Jobs’ tab for this runbook. Here you can view all the instances of a runbook that have run, called jobs. You should see the job you just started.

clip_image053

3. Click on the job you just started to view more details about its execution. Here you can see the job output, as well as any exceptions that may have occurred while the job was executing.

clip_image055

Once you get familiar with the service, you’ll be able to create more sophisticated runbooks to automate your scenarios. I encourage you to try out Microsoft Azure Automation today.

For more information, check out the Automation documentation on the Azure web site.

Summary

This most recent release of Azure includes a bunch of great features that enable you to build even better cloud solutions.  If you don't already have an Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Azure: ExpressRoute Dedicated Networking, Web Site Backup Restore, Mobile Services .NET support, Hadoop 2.2, and more

This morning we released a massive amount of enhancements to Windows Azure.  Today’s new capabilities and announcements include:

  • ExpressRoute: Dedicated, private, high-throughput network connectivity with on-premises
  • Web Sites: Backup and Restore Support
  • Mobile Services: .NET support, Notification Hub Integration, PhoneGap support
  • HDInsight: Hadoop 2.2 support
  • Management: Co-admin limit increased from 10->200 users
  • Monitoring: Service Outage Notifications Integrated within Management Portal
  • Virtual Machines: VM Agent and Background Information Support
  • Active Directory: More SaaS apps, more reports, self-service group management
  • BizTalk Services: EDIFACT protocol support, Service Bus Integration, Backup and Restore

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

ExpressRoute: Dedicated, private, high-throughput network connectivity with on-premises

Today we delivered the public preview of our new ExpressRoute service.  ExpressRoute enables dedicated, private, high-throughput network connectivity between Azure datacenters and your on-premises IT environments. Using ExpressRoute, you can connect your existing datacenters to Azure without having to flow any traffic over the public Internet, enable guaranteed network quality-of-service, and use Azure as a natural extension of an existing private network or datacenter.

Starting today you can establish dedicated connections through Equinix datacenters, or add Azure services to your MPLS VPN provided by AT&T.  We are also announcing a new strategic partnership with Level3 to enable private connectivity through Level3 Cloud Connect Solutions. 

Configuring ExpressRoute

With today’s release we’ve made some updates to our Virtual Network service to enable you to configure connections to your local networks through ExpressRoute circuits.

When creating a new Virtual Network, you now have the option to configure ExpressRoute when selecting the Site-To-Site VPN option:

image

For an already created Virtual Network, you can also configure your site-to-site connection to use ExpressRoute in the Configure tab.

Once you’ve enhanced a site-to-site connection with ExpressRoute, all VMs or Cloud Services deployed in that Virtual Network will be able to connect to the remote network over the dedicated connection – enabling faster speeds, lower network latency and complete network isolation. If your subscription doesn’t have access to ExpressRoute yet, you can sign up to use it here.

Web Sites: Backup and Restore Support

Last month we added two great new capabilities to Windows Azure Web Sites – support for Staged Publishing (which enables atomic deployments), and Web Jobs (which enable background tasks).

With today's release we are adding another great new capability to Web Sites - backup and restore support. The new web site backup support enables you to save a snapshot version of your web app – along with any SQL or MySQL databases it uses.  You can perform backups manually, or set up an automated rule to have them run on a schedule (e.g. once a day at midnight).  You can then choose to later restore a web site to a previous state, or alternatively create a new web site based on one of your original site's backups. 

This new Backup and Restore capability is available at no additional cost to all Web Site customers running in our Standard tier.  It provides a great way to run your web apps with even more confidence.

Enabling Backup Support

Enabling backup support for a web site is easy.  Simply navigate to the new “Backups” tab within the web site:

image

Click the “Backup Now” option in the command-tray to manually perform a backup.  Or set the automated backup option to true, configure the time of day you wish the backup to run, and then click the save button to set up an automated backup rule.

Mobile Services: .NET Support, Notification Hub Integration, PhoneGap Support

Today we are releasing another round of great updates to Windows Azure Mobile Services.  These updates include:

  • .NET support: You can now write your backend logic using ASP.NET Web API and run it using Mobile Services
  • Notification Hubs integration: Mobile Services now use Notification Hubs for push notifications, which enables an even richer set of push notification scenarios
  • Integrated PhoneGap support: You can now easily integrate PhoneGap apps with Mobile Services

More details on each of these below:

.NET Support

Starting today we now provide full support for writing your backend Mobile Service logic using .NET and the ASP.NET Web API framework.  This provides the following benefits:

  1. You can use ASP.NET Web API and Visual Studio together with Mobile Services to build great mobile apps
  2. You can publish any existing Web API to Mobile Services and integrate additional Mobile Services features like mobile authentication and push notifications
  3. You can take full advantage of Web API features like OData controllers, and 3rd party Web API-based frameworks like Breeze
  4. You can debug your Mobile Services .NET backend using Visual Studio running locally on your machine or remotely in Azure
  5. With Mobile Services we run, manage, monitor and scale your Web API for you.

The combination of ASP.NET Web API and Mobile Services delivers a mobile backend story that is both super powerful and really easy to use.

Getting Started with Mobile Services using .NET

It’s easy to get started with Mobile Services using today’s new .NET support. Simply go to the Windows Azure Management Portal and create a new Mobile Service (New->Compute->Mobile Service). On the first screen of the create wizard choose the new .NET option as your backend language:

image

When your new Mobile Service is created, you’ll be presented with a helpful quick start page:

image

To easily get started using .NET as your backend language, click to download the sample project listed in the quick-start page above.

Unzip the downloaded package and open the solution file. You will see a Mobile Services .NET template project. Notice this is simply an ASP.NET Web API project with additional Mobile Service NuGet packages included:

image

Note: in a future update we will provide even richer Mobile Service tooling support within Visual Studio.  This will provide additional Mobile Service tooling features on top of the standard Web API project support.  With today's preview, though, you'll use the standard Web API project template that's already in Visual Studio.

Running the Mobile Service Locally

Open the TodoItemController.cs controller file in the project you downloaded and examine its content. This controller shows you how you can use the built-in TableController<T> .NET class we now provide with Mobile Services, which enables easy remote data scenarios (note: you can also skip using this and just derive your controllers from the standard Controller base class and use an existing data API like EF, NHibernate or others). 

The default TodoItemController.cs in the project already has scaffold support for all of the key CRUD methods for a TodoItems resource.

image

Set a breakpoint inside the GetAllTodoItems() method. Then hit F5 within Visual Studio to run the Mobile Service locally. Mobile Services supports a local help page for the Web API Controllers you include in your project. This makes it really easy to test things out locally.

Click on the GetAllTodoItems link within the help page to bring up method documentation for the above Web API Controller. Click on the Test API link within the help page to invoke the GetAllTodoItems API and test it out. As you might expect, you will hit the breakpoint you’ve set up earlier.  The ability to develop and test locally, and debug all operations, makes it really easy to develop solutions.

Publishing your Mobile Service to Azure

Once you are done developing your Mobile Service locally, you can publish it to Azure. 

In a future update we will provide integrated Mobile Services publishing support directly within Visual Studio.  With today's release the easiest way to publish is to go to the Mobile Services dashboard in the Windows Azure Management Portal and download the Web Deploy publish settings file:

image

Once you download the publish settings file, simply right-click on your Web API project within the VS Solution Explorer, and then click the Publish context menu command.  Within the publish wizard you can select the publish file you downloaded, which will enable you to easily deploy the Mobile Service to Azure.

To learn more about Mobile Services .NET support, check out the tutorials in the Windows Azure Developer Center.

Notification Hub Integration

With today’s release we are making it really easy to use Notification Hubs with Mobile Services.  This integration simplifies many common scenarios and removes the need to explicitly manage push channels.  It also provides Mobile Service customers with more powerful features including:

  • Advanced targeting using tags and tag expressions
  • Broadcast push support at high scale
  • Personalization and localization using templates

Today’s Notification Hub integration is still a preview.  You can enable it using the push tab of your Mobile Service:

image

Once enabled you can easily send push notifications to any or all users you wish with a single API call in the backend (using either the .NET or Node.js based API).

Integrated PhoneGap Support

Mobile Services already provides support for a number of cross-platform mobile client frameworks, including Xamarin, PhoneGap, and Sencha. Today we added an integrated PhoneGap quick start in the Azure management portal, which significantly simplifies developing cross-platform mobile apps with PhoneGap and Mobile Services:

image 

HDInsight: Hadoop 2.2 Support

HDInsight is our 100% compatible Apache Hadoop-based distribution for Windows Azure.  With HDInsight you can leverage data stored in Windows Azure Blob Storage or the native HDFS file system local to the compute nodes and crunch massive amounts of data.

We now support Hadoop 2.2 clusters (in preview mode) with our HDInsight service.  This new update provides an order of magnitude (up to 40x) faster query response times, much better data compression (up to 80%), and enables you to leverage the benefits of YARN.
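
If you provision clusters from PowerShell, the HDInsight cmdlets let you choose the cluster version at creation time. Below is a minimal sketch - the account names, key, and cluster name are placeholders, and treat the exact parameter names and the version string for the Hadoop 2.2 preview as assumptions:

# Sketch: create a 4-node HDInsight cluster on the newer cluster version
$creds = Get-Credential   # admin user name/password for the cluster

New-AzureHDInsightCluster -Name "mycluster" -Location "North Europe" `
    -DefaultStorageAccountName "mystorage.blob.core.windows.net" `
    -DefaultStorageAccountKey "<storage-key>" `
    -DefaultStorageContainerName "mycontainer" `
    -ClusterSizeInNodes 4 -Credential $creds -Version "3.0"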

You can learn more about the Hadoop 2.2 improvements from our data team’s announcement blog post as well as by following this tutorial.

Management: Co-admin support increased from 10 to 200 administrators per subscription

Last fall I blogged about a number of Azure enhancements we had rolled out to enable a better enterprise authentication experience. These improvements included the ability to associate Azure subscriptions with Active Directory tenants, and to enable Active Directory SSO into Azure along with multi-factor authentication support.

Today we are making another nice management improvement – which is to increase the number of co-administrators that can be enabled on each Azure subscription to 200 (previously we only supported 10 co-admins per subscription).  The increased co-admin limit will make it easier for large teams to share a single Azure subscription, and simplify a number of subscription management scenarios.

Monitoring: Service Outage Notifications integrated within Management Portal

Service outages unfortunately sometimes happen with the cloud.  One of the asks we’ve heard from customers has been to improve the notification process when a service has an issue and to provide better real-time, per-user customized, information on status.  Rather than just learning that an abstract service is having an issue, you want to know if it is the particular service instance your app is using – and if so what the latest health status is with it.

With today’s release, we are introducing Azure incident notification support directly in the Azure Management Portal – and customizing it based on the particular service instances you are using. When a service outage incident occurs that affects your apps, you will now see a notification in the Portal:

image

We will surface this alert for the following types of incidents:

  • Partial Performance Degradation
  • Partial Service Interruption
  • Full Performance Degradation
  • Full Service Interruption
  • Advisory

If you click OK within the notification window, you will see a dialog that provides more details about the incident(s):

image

This dialog will include key information such as the timestamp of the incident, the name of the service, the incident type, a description of the latest update related to the incident, and the SubscriptionID (where available) of the subscriptions you have that use the service in question. With this release, the SubscriptionID will be provided for incidents involving Virtual Machines, Cloud Services, Storage, SQL Databases, Service Bus and Web Sites. You may see “Not Available” for other services, but we are working to add these in future releases.

From this incident details dialog, you can navigate to the Operation Logs page by clicking on the link at the bottom of the dialog. This page will give you the filtered view of history for incidents that carry the same SubscriptionID information.  This will allow you to see full details for every past incident involving this service (along with start and end times of the incidents).

We will continue to enhance this feature set over the next few releases to fold in all Azure Services to make it easy for you to detect outages and updates that pertain to your specific service(s) on Azure.

Virtual Machines: VM Agent and Background Info Extension

With today’s release we are adding a new feature that helps make managing Virtual Machines even more powerful: VM Agent support.

For those of you who use Cloud Services with web and worker roles, you may know that we already use an agent inside these workloads to facilitate certain management features.  With today’s release we are introducing a new VM Agent for IaaS VMs that over time will bring this same kind of managed functionality to Virtual Machines as well.

The VM Agent is a lightweight and unobtrusive process that you can optionally enable to run inside your Windows and Linux VMs. The VM Agent can then be used to install and manage extensions, which are software modules that extend the functionality of a VM and help make common management scenarios easier.  Over the next several months you’ll see us deliver many new extensions that you can optionally enable within your virtual machines.

The VM Agent is automatically installed when creating a VM using Quick Create. You can opt out of installing the VM Agent by creating a VM using the From Gallery option and unselecting the “Install the VM Agent” checkbox:

image

Background Info Extension

One small but useful extension that we have enabled with today’s VM Agent release is one we call “BGInfo”.  This extension helpfully displays information about a Windows VM on the desktop of the VM instance when you RDP into it – providing an easy way to quickly figure out the VM’s configuration settings (internal and public IP, disk space, memory, deployment ID, etc):

clip_image001

Over the next several months you’ll see us continue to ship additional extensions that extend the management support of VMs even further.

Active Directory: More apps, more reports and self-service group management

With today's release we've updated Windows Azure Active Directory to support SSO integration with more SaaS apps, and enhanced the Windows Azure Active Directory Premium tier (which is currently in preview) with more built-in reports, end-user self-service, and delegated group management support.

Enabling Active Directory SSO to SaaS applications

We now enable Active Directory single-sign-on (SSO) support with over 600 popular SaaS apps. To integrate these applications with your organization's Active Directory, select your Active Directory within the Windows Azure Management Portal, change to the Applications tab, then click the Add button:

image

Then choose Add an application for my organization to use.  This will allow you to pick from 600+ popular SaaS applications to integrate with:

image

Once an app has been integrated with your Active Directory, you can select which users in your directory can sign into the app.  Once you do this, the app will appear on the access panel for each user logged into the http://myapps.microsoft.com site – enabling them to sign in and begin using it with their corporate credentials.

Premium Security and Usage Reports

Windows Azure AD Premium is designed to address the identity and access management needs of enterprises. It is currently in preview, and you can use its features including tenant branding and self-service password reset while in preview at no charge. At the end of the preview it will be converted to a paid service.

To find out more about how to get started with the Windows Azure AD Premium preview, see this earlier blog post on the Active Directory blog. Briefly, there's a two-step process to evaluate this preview. First, navigate to the Windows Azure Preview Features page and add Windows Azure Active Directory Premium to your subscription by clicking "try it now" and selecting the "Free Trial" or another subscription. Then, in the Windows Azure Management Portal, select a directory and, on the Configure tab of the directory, move the slider for Premium features to enabled.

image

Usage Reports

With today’s update we’ve added new reports to the Windows Azure Active Directory Premium tier that will help you better understand how your organization’s users are accessing applications.

You can now click the Reports tab to see additional views which highlight potential account compromise scenarios. These reports show sign-ins from IP addresses with suspicious activity, irregular sign-in activity, and a list of users whose accounts may have been compromised.

Delegated and self-service group management in Windows Azure AD Premium

With today’s release we’ve also added delegated and self-service group management support as part of the Premium preview. In previous updates we enabled administrators to view and manage groups in the Windows Azure Management Portal - now we’re enabling end users within your organizations to create and join groups as well.

Once the Premium preview is enabled on a directory, a user who is a member of the directory can get a group management experience by going to http://myapps.microsoft.com, signing in, and clicking on the Groups tab. The user will then see all of the groups that are present in the directory and can request to join them. They can also filter the view to show only groups of which they're a member or groups they own:

image

image

A user can also create a new group.

image

These groups can be used to control access to SaaS applications or within applications themselves, such as in SharePoint Online. Currently these groups are not mail-enabled - we’ll add that functionality in a future release.

More updates on these and other features in Premium are on the Active Directory team blog.

BizTalk Services: EDIFACT Protocol Support, Service Bus Integration, Backup and Restore

With today's release we are updating Windows Azure BizTalk Services with a host of new features. If you are already using BizTalk Services, your environment will be automatically updated with the following new features:

EDIFACT Protocol Support and X12 Schema Updates

We now support EDIFACT messaging versions up to D10B natively in the platform. When you create a new EDI agreement you can now choose EDIFACT as the target protocol (instead of X12) and configure the agreement. Features such as batching, tracking, and AS2 with EDIFACT are all supported with today's update.

X12 messaging up to version 6030 is now supported as well, along with Message Type 999 (in addition to 997) for acknowledgements.

Pulling Messages from Service Bus Queues and Topics

A BizTalk “Bridge” can now pull messages directly from a Service Bus Queue or Topic without having to write to an intermediary service. After installing the new BizTalk Services SDK, the new Sources are available within the VS Toolbox of BizTalk Services projects:

image

This enables complex configurations such as the one below, where messages can be pulled from an FTP endpoint, Service Bus Queue, or Service Bus Topic, processed by the bridge, and sent to an FTP endpoint, Service Bus Queue, or Service Bus Topic based on route rules:

image

Service Bus Shared Access Signatures (SAS) support with Service Bus Queues and Topics:

You can now use SAS keys to configure Service Bus Queues and Topics with Agreements and Bridges in the Azure Portal as well as in Visual Studio. 

BizTalk Adapter Services No Longer Needs SQL On Premises

Starting today, all BizTalk Adapter configuration data is stored in the cloud without requiring any additional SQL Express configuration on-premises. For existing customers, the SDK installation provides an option to update/migrate the existing configuration to the cloud:

image

Backup and Restore Support

Backup and restore operations within BizTalk Services can now be easily configured and managed through the Azure management portal.  Backups can be scheduled by following these five steps:

  1. Go to your deployment’s Configure page and flip the Backup status from None to Automatic.
  2. Add the storage account where you want the backup of the deployment to be stored
  3. Tweak the first occurrence and recurrence schedule
  4. Enter the retention period in days, or leave the default of 20 days
  5. Hit Save on your configuration changes

Operation Log Support

You can now view all BizTalk Services management operations such as Create, Delete, Backup, etc. in the Azure Portal using the Management Services tab.  This makes it easy to audit and review all management operations performed with the service.

We hope these features will add value to your integration scenarios and enrich your BizTalk Services experience. We would love to hear your feedback via the BizTalk Services forums and UserVoice.

Summary

Today’s Windows Azure release enables a bunch of great new scenarios.

If you don't already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Windows Azure: Staging Publishing Support for Web Sites, Monitoring Improvements, Hyper-V Recovery Manager GA, and PCI Compliance

This morning we released another great set of enhancements to Windows Azure.  Today’s new capabilities and announcements include:

  • Web Sites: Staged Publishing Support and Always On Support
  • Monitoring Improvements: Web Sites + SQL Database Alerts
  • Hyper-V Recovery Manager: General Availability Release
  • Mobile Services: Support for SenchaTouch
  • PCI Compliance: Windows Azure Now Validated for PCI DSS Compliance

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Web Sites: Staged Publishing Support

With today’s release, you can now enable staged publishing to your Windows Azure Web Sites.  This new feature is really powerful, and enables you to deploy updates of your web apps/sites to a staging version of the site that can be accessed via a URL that is different from your main site.  You can use this staged site to test your site/app deployment and then, when ready, instantaneously swap the content and configuration between the live site and the staging version. 

This new feature enables you to deploy changes with more confidence.  And it ensures that your site is never in an inconsistent state (where some files have been updated and others not) - now you can immediately swap all changes to all of the files in one shot.

Enabling Staged Publishing Support

To set up staged publishing, go to the DASHBOARD tab of a web site and click Enable staged publishing from the quick glance section:

image

Clicking this link will cause Azure to create a new staging version of the web-site and link it to the existing site.  This linkage is represented in the navigation of the Windows Azure Management Portal – the staging site will show up as a sub-node of the primary site:

image

If you look closely at the name of the staging site, you’ll notice that its URL by default is sitename-staging (e.g. if the primary site name was “scottgu”, the staging site would be “scottgu-staging”):

image

You can optionally map any custom DNS name you want to the staging site (using either a CNAME or an A record) – just like you would for a normal site.  So your staging domain doesn't have to have an azurewebsites.net extension.  In the scenario above I could remap the staging domain to be staging.scottgu.com, or scottgu-staging.com, or even foobar.scottgu.com if I wanted to. 

The staging URL doesn't change between deployments of a site – so you can configure a custom DNS name once and use it across all subsequent deployments.  You can also optionally enable SSL on the staging site and upload an SSL certificate to use with the staging domain (ensuring you can fully test/validate your SSL scenarios before swapping live).

Configuring the Staging Site

You can click on the staging site to manage it just like any other site:

image 

The SCALE tab and the LINKED RESOURCES tabs are disabled for staging sites, but all other tabs work as expected.  You can use the CONFIGURE tab to set configuration settings like database and application connection-strings (if you set these at the site level they override anything you might have in a web.config file).

One thing you’ll also notice when you open the staging site is that there is a new SWAP button in the bottom command-bar of it – we’ll talk about how to use that in a little bit.

Deploying to the Staging Site

Deploying a new instance of your web-app/site to the staging site is really easy.  Simply deploy to it just like you would any normal site.  You can use FTP, the built-in “Publish” dialog inside Visual Studio, Web Deploy, Git, TFS, VS Online, GitHub, BitBucket, DropBox or any of the other deployment mechanisms we already support.  You configure these just like you would for a normal site.

Below I’m going to use the built-in VS publish wizard to publish a new version of the site to the staging site:

image

Once this new version of the app is deployed to the staging site we can access a page in it using the staging domain (in this case http://scottgu-staging):

image

Note that the new version of the site we deployed is only in the staging site.  This means that if we hit the primary site domain (in this case http://scottgu) we wouldn’t see this new “V2” update - it would instead show any older version that had been previously deployed:

image

This allows us to do final testing and validation of the staging version without impacting users visiting the live production site.

Swapping Deployments

At some point we'll be ready to roll our staged version to be the live production site version.  Doing this is easy – all we need to do is push the SWAP button within the command-bar of either our live site or staging site using the Windows Azure Portal (you can also automate this from the command-line or via a REST call – see the sketch below):

image

When we push the SWAP button we’ll be prompted with a confirmation dialog explaining what is about to happen:

image

If we confirm we want to proceed with the swap, Azure will immediately swap the content of the live site (in this case http://scottgu) with the newer content in the staging site (in this case http://scottgu-staging).  This will take place immediately – and ensure that all of the files are swapped in a single shot (so that you never have mix-matched files).

Some settings from the staged version will automatically copy to the production version – including things like connection string overrides, handler mappings, and other settings you might have configured.  Other settings like the DNS endpoints, SSL bindings, etc will not change (ensuring that you don’t need to worry about SSL certs used for the staging domain overriding the production URL cert, etc).

Once the swap is complete (the command takes only a few seconds to execute), you’ll find that the content that was previously in the staging site is now in the live production site:

image

And the content that had been in the older live version of the site is now in the staging site.  Having the older content available in the staging site is useful – as it allows you to quickly swap it back to the previous site if you discover an issue with the version that you just deployed (just click the SWAP button again to do this).  Once you are sure the new version is fine you can just overwrite the staging site again with V3 of your app and repeat the process again.
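
As an aside, here's what the command-line version of the swap looks like with the Azure PowerShell module. Treat the cmdlet name as an assumption - it shipped in later releases of the module - and substitute your own site name:

# Sketch: swap the staging slot into production for the "scottgu" site.
# Assumes the subscription has already been imported, e.g. via
# Import-AzurePublishSettingsFile.
Switch-AzureWebsiteSlot -Name "scottgu"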

Deployment with Confidence

We think you'll find that the new staged publishing feature is both easy to use and very powerful, enabling you to handle deployments of your sites with an industrial-strength workflow.

Web Sites: Always On Support

One of the other useful Web Site features that we are introducing today is a feature we call “Always On”.  When Always On is enabled on a site, Windows Azure will automatically ping your Web Site regularly to ensure that the Web Site is always active and in a warm/running state.  This is useful to ensure that a site is always responsive (and that the app domain or worker process has not paged out due to lack of external HTTP requests). 

It is also useful as a way to keep a Web Site active for scenarios where you want to run background code within it, irrespective of whether it is actively processing external HTTP customer requests.  We have another new feature we are enabling this week called “Web Jobs” that makes it really easy to write this background code and run it within a Web Site. I'll blog more about this feature and how to use it in the next few days.

You can enable Always On support for Web Sites running in Standard mode by navigating to the CONFIGURE tab within the portal, and then toggling the Always On button that is now within it:

image

Monitoring Improvements: Web Sites + SQL Database Alerts

With almost every release we make improvements to the monitoring functionality of our Azure services. Today's update brings two nice new improvements:

  1. Metrics updated every minute for Windows Azure Web Sites
  2. Alerting for more metrics from Windows Azure Websites and Windows Azure SQL Databases

Monitoring Data Every Minute

With today's release we are now updating statistics on the monitoring dashboard of a Web Site every minute, so you can get much fresher information on exactly how your website is being used (prior to today the granularity was not as fine-grained):

image

Viewing data at this higher granularity can make it easier to observe changes to your website as they happen. No additional configuration is required to get data every minute – it is now automatically enabled for all Azure Websites.

Expanding Alerting

When you create alerts you can now choose between six different services:

  • Cloud Service
  • Mobile Service
  • SQL Database (New Today!)
  • Storage
  • Virtual Machine
  • Web Site (More Metrics Today!)

To get started with Alerting, click on the Management Services extension on the left navigation tab of the Windows Azure Management Portal:

image

Then, click the Add Rule button in the command bar at the bottom of the screen. This will open a wizard for creating an alert rule. You can see all of the services that now support alerts:

image

New Web Site Alert Metrics

With today’s release we are adding the ability to alert on any metric that you see for a Web Site in the portal (previously we only supported alerts on Uptime and Response Time metrics). Today’s new metrics include support for setting threshold alerts for errors as well as CPU time and total requests:

image

The CPU time and Data Out metric alerts are particularly useful for Free or Shared websites – you can now use these alerts to email you if you’re getting close to exceeding your quotas for a free or shared website (and need to scale up instances).

New SQL Alert Metrics

With today's release you can also now define alerts for your SQL Databases. For Web and Business tier databases you can set up alerts on the storage used by the database.  There are also additional metrics and alerts for SQL Database Premium (which is currently in preview), such as CPU Cores and IOPS.

Once you’ve set up these new alerts, they behave just like alerts for other services. You’ll be informed when they cross the thresholds you establish, and you can see the recent alert history in the dashboard:

image

Windows Azure Hyper-V Recovery Manager: General Availability Release

I’m excited to announce the General Availability of Windows Azure Hyper-V Recovery Manager (HRM). This release is now live in production, backed by an enterprise SLA, supported by Microsoft Support, and is ready to use for production scenarios.

Windows Azure Hyper-V Recovery Manager helps protect your on-premises applications and services by orchestrating the protection and recovery of Virtual Machines running in a System Center Virtual Machine Manager 2012 R2 or System Center Virtual Machine Manager 2012 SP1 private cloud to a secondary location. With simplified configuration, automated protection, continuous health monitoring and orchestrated recovery, the Hyper-V Recovery Manager service can help you implement disaster recovery and recover applications accurately, consistently, and with minimal downtime.

image

The service leverages the Hyper-V Replica technology available in Windows Server 2012 and Windows Server 2012 R2 to orchestrate the protection and recovery of Hyper-V Virtual Machines from one on-premises site to another. Application data always travels on your on-premises replication channel. Only the metadata needed for orchestration (such as names of logical clouds, virtual machines, networks, etc.) is sent to Azure. All traffic sent to/from Azure is encrypted.

Getting Started

To get started, use the Windows Azure Management Portal to create a Hyper-V Recovery Manager Vault. Browse to Data Services > Recovery Services and click New to create a New Hyper-V Recovery Manager Vault. You can name the vault and specify a region where you would like the vault to be created.

clip_image002

Once the Hyper-V Recovery Manager vault is created, you'll be presented with a simple tutorial that will help guide you on how to register your SCVMM Servers and configure protection and recovery of Virtual Machines.

clip_image004

To learn more about setting up Hyper-V Recovery Manager in your deployment follow our detailed step-by-step guide.

Key Benefits of Hyper-V Recovery Manager

Hyper-V Recovery Manager offers the following key benefits that differentiate it from other disaster recovery solutions:

  • Simple Setup and Configuration: HRM dramatically simplifies configuration and management operations across large numbers of Hyper-V hosts, Virtual Machines and data-centers.
  • Automated Protection: HRM leverages the capabilities of Windows Server and System Center to provide ongoing replication of VMs and ensures protection throughout the lifecycle of a VM.
  • Remote Monitoring: HRM leverages the power and reach of Azure to provide a remote monitoring and DR management service that can be accessed from anywhere.
  • Orchestrated Recovery: Recovery Plans enable automated DR orchestration by sequencing the failover of different application tiers, with customization via scripts and manual actions.

New Improvements

The Hyper-V Recovery Manager service has been enhanced since the initial October Preview with several nice improvements:

  • Improved Failback Support: The Failback support has been improved in scenarios where the primary host cluster has been rebuilt after an outage.
  • Support for Kerberos based Authentication: Cloud configuration now allows selecting Kerberos based authentication for Hyper-V Replica. This is useful in scenarios where customers want to use 3rd party WAN optimization and compression and have AD trust available between primary and secondary sites.
  • Support for Upgrade from VMM 2012 SP1 to VMM 2012 R2: HRM service now supports upgrades from VMM 2012 SP1 to VMM 2012 R2.
  • Improved Scale: The UI and service have been enhanced for better scale support.

Please visit the Windows Azure web site for more information on Hyper-V Recovery Manager. You can also refer to the additional product documentation, and visit the HRM forum on MSDN to engage with other customers.

Mobile Services: Support for SenchaTouch

I'm excited to announce that in partnership with our friends at Sencha, we are today adding support for SenchaTouch to Windows Azure Mobile Services. SenchaTouch is a well-known HTML/JavaScript-based development framework for building cross-platform mobile apps and web sites. With today's addition, you can easily use Mobile Services with your SenchaTouch app.

You can download the Windows Azure extension for Sencha here, configure the Sencha loader with the location of the Azure extension, and add the Azure package to your app.json file:

{ "name": "Basic", "requires": ["touch-azure"] }

Once you have the Azure extension added to your Sencha project, you can connect your Sencha app to your Mobile Service simply by adding the following initialization code:

Ext.application({
    name: 'Basic',
    requires: ['Ext.azure.Azure'],
    azure: {
        appKey: 'myazureservice-access-key',
        appUrl: 'myazure-service.azure-mobile.net'
    },
    launch: function () {
        // Call Azure initialization
        Ext.Azure.init(this.config.azure);
    }
});

From here on you can data-bind your data model to Azure Mobile Services, authenticate users, and use push notifications. Follow this detailed Getting Started tutorial to get started with SenchaTouch and Mobile Services, and read more detailed documentation at the Mobile Services Sencha extension resources page.

Windows Azure Now Validated for PCI DSS Compliance

We are very excited to announce that Windows Azure has been validated for compliance with the Payment Card Industry (PCI) Data Security Standards (DSS) by an independent Qualified Security Assessor (QSA).

The PCI DSS is the global standard that any organization of any size must adhere to in order to accept payment cards, and to store, process, and/or transmit cardholder data. By providing PCI DSS validated infrastructure and platform services, Windows Azure delivers a compliant platform for you to run your own secure and compliant applications. You can now achieve PCI DSS certification for those applications using Windows Azure.

To assist customers in achieving PCI DSS certification, Microsoft is making the Windows Azure PCI Attestation of Compliance and Windows Azure Customer PCI Guide available for immediate download.

Visit the Trust Center for a full list of in-scope features and for more information on Windows Azure security and compliance.

Summary

Today's release includes a bunch of great features that enable you to build even better cloud solutions.  If you don't already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Windows Azure Documentation Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Windows Azure: New Scheduler Service, Read-Access Geo Redundant Storage, and Monitoring Updates

This morning we released another nice set of enhancements to Windows Azure.  Today’s new capabilities include:

  • Scheduler: New Windows Azure Scheduler Service
  • Storage: New Read-Access Geo Redundant Storage Option
  • Monitoring: Enhancements to Monitoring and Diagnostics for Azure services

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Scheduler: New Windows Azure Scheduler Service

I’m excited to announce the preview of our new Windows Azure Scheduler service.  The Windows Azure Scheduler service allows you to schedule jobs that invoke HTTP/S endpoints or post messages to a storage queue on any schedule you define.  Using the Scheduler, you can create jobs that reliably call services either inside or outside of Windows Azure and run those jobs immediately, on a regular schedule, or set them to run at a future date.

To get started with Scheduler, you first need to sign up for the preview on the Windows Azure Preview page. Once you enroll in the preview, you can sign in to the Management Portal and start using it.

Creating a Scheduler Job

Once you have the Scheduler preview enabled on your subscription, you can easily create a new job following a few short steps:

Click New->App Services->Scheduler->Custom Create within the Windows Azure Management Portal:

image

Choose the Windows Azure region you want the jobs to run in, then select an existing job collection or create a new one to add the job to:

 image

You can then define your job action. In this case, we'll create an HTTP action that performs a GET request against a web site (you can also use other HTTP verbs, as well as HTTPS):

image

For processing longer requests, or enabling a service to be invoked when offline, you may want to post a message to a storage queue rather than standing up and invoking a web service.  To post a message to a storage queue, just choose Storage Queue as your action, then create or select the storage account and queue to send the request to:

image

Once you've defined the job to perform, you'll want to set up the recurrence schedule for it. The recurrence can be as simple as run immediately (useful for testing), at a specific time in the future, or on a recurring schedule:

image

Once created, the job will be listed in the jobs view:

image

The jobs view shows a summary status of failures/faults with any job – you can then click the history tab to get even more detailed status (including the HTTP response headers + body for any HTTP based job).

I encourage you to try out the Scheduler – I think you'll find it a really useful way to automate jobs to happen in a reliable way.  More information on how to use it (including how to automate the creation of tasks from the command-line or your own applications) is available in the Windows Azure documentation.

Storage: New Read-Access Geo Redundant Storage Option

I’m excited to announce the preview release of our new Read-Access Geo Redundant Storage (RA-GRS) option. RA-GRS is a major improvement to our Windows Azure Storage Geo Replicated Storage offering.  Prior to today, our Geo-Replicated Storage option provided built-in support for automatically replicating your storage data (blobs, queues, tables) from one primary region to another (for example: US East to US West), but access to the secondary location data wasn’t provided except in a disaster scenario which necessitated a storage cluster failover.

With today's update you can now always have read access to your secondary storage replica.  This enables you to have immediate access to your data in the event of a temporary failure in your primary storage location (and to build in support within your applications to handle the read fail-over automatically).  Today's update also enables you to test and track the replication of your data so you can easily verify the replication (which happens asynchronously in the background). 

Enabling Read Access

In order to enable RA-GRS support, you will need to sign up for the Read Access Geo Redundant Storage preview on the Windows Azure Preview page. Once you enroll in the preview, you can sign in to the Management Portal and simply navigate to the Configure tab for your Storage Account to enable it:

image

Once enabled, you can access your secondary storage endpoint location as myaccountname-secondary.<service>.core.windows.net.  You can use the same access keys for the secondary storage location as the ones for your primary storage endpoint.
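
A quick way to convince yourself that read access works is to request the same blob from both endpoints. The sketch below uses a blob in a publicly readable container (the account, container, and blob names are placeholders); authenticated requests would need to be signed with your access key in the usual way:

# Sketch: read the same public blob from the primary and secondary endpoints
$primary   = "https://myaccountname.blob.core.windows.net/public/readme.txt"
$secondary = "https://myaccountname-secondary.blob.core.windows.net/public/readme.txt"

(Invoke-WebRequest -Uri $primary).Content
(Invoke-WebRequest -Uri $secondary).Content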

For additional details on RA-GRS and examples of how to use it, read the storage blog post entry at http://blogs.msdn.com/b/windowsazurestorage/archive/2013/12/04/introducing-read-access-geo-replicated-storage-ra-grs-for-windows-azure-storage.aspx

Monitoring: Enhancements to Monitoring and Diagnostics for Azure services

Today’s update includes several nice enhancements to our monitoring and diagnostics capabilities of Windows Azure:

Monitoring metrics for Premium SQL Databases

With today’s update you can now monitor metrics for the CPU and IO activity of Premium SQL databases, and the storage activity of both Premium and Standard databases. You can find more details on MSDN.

clip_image002

Update to Web Site diagnostics

Previously, you could select an existing blob container when configuring the storage location for your web server HTTP logs.

image

With this release, you can now additionally create a new blob container to push your web server logs to, in a single, consistent configuration experience within the Windows Azure Management Portal. You can do so by simply navigating to the configure tab for your web site, clicking on the manage storage button, and selecting the option to create a new blob container.

image
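
If you prefer scripting to the portal, the same container can be created with the Azure PowerShell storage cmdlets. A minimal sketch, where the account name, key, and container name are placeholders:

# Sketch: create a blob container to hold web server logs
$ctx = New-AzureStorageContext -StorageAccountName "mylogsaccount" `
    -StorageAccountKey "<storage-key>"

New-AzureStorageContainer -Name "weblogs" -Context $ctx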

Operation history support for Windows Azure Mobile Services

The Operation Logs feature of Windows Azure allows you to audit/log management operations performed on your Windows Azure services.  You can review them by clicking on the Operation Logs tab within the Management Services extension of the Management Portal:

image

With today’s update we have added more than 20 new log actions for Windows Azure Mobile Services that will now show up in the operation logs list.

Summary

Today's release includes a bunch of great features that enable you to build even better cloud solutions.  If you don't already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Presentations I’m doing in Dublin and London Dec 2nd->5th

I’ll be in Ireland and the UK next week presenting at several events.  Below are details on the talks I’ll be doing if you want to come along and hear them:

Dublin: Monday Dec 2nd

I’m doing two separate free events in Dublin on Monday:

  • Windows Azure and the Cloud, Monday 1-3pm.  This event is free to attend, and I'll be doing a two-hour keynote/overview session on Windows Azure as part of it.  This will be a great talk to attend if you are new to Windows Azure and are interested in learning more about what you can do with it.  Later sessions at the event also cover VS 2013, building iOS/Android apps with C# using Xamarin, and F# with Data and the Cloud.  Learn more here and sign up for free.
  • Building Real World Applications using Windows Azure, Monday 6:00-9:00pm.  This event is also free to attend, and during it I'll walk through building a real world application using Windows Azure and discuss patterns and best-practice techniques for building real world apps along the way.  The content is intermediate/advanced level (my goal is to melt your brain by the end) but doesn't assume prior knowledge of Windows Azure.  Learn more here and sign up for free.

There is no content overlap between the two talks – so feel free to attend both if you want to!

London: Wed Dec 4th and 5th

I’m presenting at the NDC London Conference on Dec 4th and Dec 5th as well.  This is a great developer conference being hosted in the UK for the first time.  It has a great line up of speakers attending.

I’m presenting 2 separate two-part talks: 

  • Building Real World Applications using Windows Azure (Parts 1 and 2), Thursday from 9am-11:20am.  I'll walk through building a real world application using Windows Azure and discuss patterns and best-practice techniques for building real world apps along the way.  The content is intermediate/advanced level (my goal is to melt your brain by the end) but doesn't assume prior knowledge of Windows Azure.

Hope to see some of you there!

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Windows Azure: General Availability Release of BizTalk Services, Traffic Manager, Azure AD App Access + Xamarin support for Mobile Services

This morning we released another great set of enhancements to Windows Azure.  Today’s new capabilities include:

  • BizTalk Services: General Availability Release
  • Traffic Manager: General Availability Release
  • Active Directory: General Availability Release of Application Access Support
  • Mobile Services: Active Directory Support, Xamarin support for iOS and Android with C#, Optimistic concurrency
  • Notification Hubs: Price Reduction + Debug Send Support
  • Web Sites: Diagnostics Support for Automatic Logging to Blob Storage
  • Storage: Support for alerting based on storage metrics
  • Monitoring: Preview release of Windows Azure Monitoring Service Library

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

BizTalk Services: General Availability Release

I'm excited to announce the general availability release of Windows Azure BizTalk Services.  This release is now live in production, backed by an enterprise SLA, supported by Microsoft Support, and is ready to use for production scenarios.

Windows Azure BizTalk Services enables powerful business scenarios like supply chain and cloud-based electronic data interchange and enterprise application integration, all with a familiar toolset and enterprise grade reliability.  It provides built-in support for managing EDI relationships between partners, as well as setting up EAI bridges with on-premises assets – including built-in support for integrating with on-premises SAP, SQL Server, Oracle and Siebel systems.  You can also optionally integrate Windows Azure BizTalk Services with on-premises BizTalk Server deployments – enabling powerful hybrid enterprise solutions. 

Creating a BizTalk Service

Creating a new BizTalk Service is easy – simply choose New->App Services->BizTalk Service to create a new BizTalk Service instance:

image

Windows Azure will then provision a new high-availability BizTalk instance for you to use:

image

Each BizTalk Service instance runs in a dedicated per tenant environment. Once provisioned you can use it to integrate your business better with your supply chain, enable EDI interactions with partners, and extend your on-premises systems to the cloud to facilitate EAI integration.

Changes between Preview and GA

The team has been working extremely hard in preparing Windows Azure BizTalk Services for General Availability.  In addition to finalizing the quality, we also made a number of feature improvements to address customer feedback during the preview.  These improvements include:

  • B2B and EDI capabilities are now available even in the Basic and Standard tiers (in the preview they were only in the Premium tier)
  • Significantly simplified provisioning process – ACS namespace and self-signed certificates are now automatically created for you
  • Support for worldwide deployment in Windows Azure regions
  • Multiple authentication IDs & multiple deployments are now supported in the BizTalk portal.
  • Backup and restore are now supported to enable business continuity

If you are already using BizTalk Services in preview, you will be transitioned automatically to the GA service and new pricing will take effect on January 1, 2014.

Getting Started

Read this article to get started with provisioning your first BizTalk Service.  BizTalk Services supports a Developer Tier that enables you to do full development and testing of your EDI and EAI workloads at a very inexpensive rate. To learn more about the services and new pricing, read the BizTalk Services documentation.

Traffic Manager: General Availability Release

I’m excited to announce that Windows Azure Traffic Manager is also now generally available.  This release is now live in production, backed by an enterprise SLA, supported by Microsoft Support, and is ready to use for production scenarios.

Windows Azure Traffic Manager allows you to control the distribution of user traffic to applications that you host within Windows Azure. Your applications can run in the same data center, or be distributed across different regions across the world.  Traffic Manager works by applying an intelligent routing policy engine to the Domain Name Service (DNS) queries on your domain names, and maps the DNS routes to the appropriate instances of your applications.

You can use Traffic Manager to improve application availability - by enabling automatic customer traffic fail-over scenarios in the event of issues with one of your application instances.  You can also use Traffic Manager to improve application performance - by automatically routing your customers to the application instance nearest them (e.g. you can set up Traffic Manager to route customers in Europe to a European instance of your app, and customers in North America to a US instance of your app).

Getting Started

Setting up Traffic Manager is easy to do.  Simply choose New->Network Services->Traffic Manager within the Windows Azure Management Portal:

image 

When you create a Windows Azure Traffic Manager you can specify a “load balancing method” – this indicates the default traffic routing policy engine you want to use. Above I selected the “failover” policy. 

image

Once your Traffic Manager instance is created you can click the “endpoints” tab to select application or service endpoints you want the traffic manager to route traffic to.  Below I’ve added two virtual machine deployments – one in Europe and one in the United States:

image
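
The same setup can also be scripted. Here's a minimal sketch using the Traffic Manager cmdlets from the Azure PowerShell module (the cmdlets shipped after this post was written, so treat the names as assumptions; the profile and endpoint names are placeholders):

# Sketch: create a failover profile and register two cloud service endpoints
$tmProfile = New-AzureTrafficManagerProfile -Name "scottgudemo" `
    -DomainName "scottgudemo.trafficmanager.net" -LoadBalancingMethod "Failover" `
    -Ttl 30 -MonitorProtocol "Http" -MonitorPort 80 -MonitorRelativePath "/"

# Under the Failover policy, endpoints are tried in the order they are added
$tmProfile = Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $tmProfile `
    -DomainName "scottgudemo11.cloudapp.net" -Type "CloudService" -Status "Enabled"
$tmProfile = Add-AzureTrafficManagerEndpoint -TrafficManagerProfile $tmProfile `
    -DomainName "scottgudemo12.cloudapp.net" -Type "CloudService" -Status "Enabled"

# Commit the profile changes
Set-AzureTrafficManagerProfile -TrafficManagerProfile $tmProfile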

Enabling High Availability

Traffic Manager monitors the health of each application/service endpoint configured within it, and automatically re-directs traffic to other application/service endpoints should any service fail.

In the following example, Traffic Manager is configured in a ‘Failover’ policy, which means by default all traffic is sent to the first endpoint (scottgudemo11), but if that app instance is down or having problems (as it is below) then traffic is automatically redirected to the next endpoint (scottgudemo12):

image

Traffic Manager allows you to configure the protocol, port and monitoring path used to monitor endpoint health. You can use any of your web pages as the monitoring path, or you can use a dedicated monitoring page, which allows you to implement your own custom health-check logic:

image

Enabling Improved Performance

You can deploy multiple instances of your application or service in different geographic regions, and use Traffic Manager’s ‘Performance’ load-balancing policy to automatically direct end users to the closest instance of your application. This improves performance for an end user by reducing the network latency they experience:

image

In the Traffic Manager instance we created earlier, we had VM deployments in both the West Europe and West US regions of Windows Azure:

image

This means that when a customer in Europe accesses our application, they will automatically be routed to the West Europe application instance.  When a customer in North America accesses our application, they will automatically be routed to the West US application instance. 

Note that endpoint monitoring and failover is a feature of all Traffic Manager load-balancing policies, not just the ‘failover’ policy.  This means that if one of the above instances has a problem and goes offline, the traffic manager will automatically direct all users to the healthy instance.

Seamless application updates

You can also explicitly enable and disable each of your application/service endpoints in Traffic Manager.  To do this simply select the endpoint, and click the Disable command:

image

This doesn’t stop the underlying application - it just tells Traffic Manager to route traffic elsewhere. This enables you to migrate traffic away from a particular deployment of an application/service while it is being updated and tested, and then bring it back into rotation, all with just a couple of clicks.

General Availability

As Traffic Manager plays a key role in enabling high availability applications, it is of course vital that Traffic Manager itself is highly available. That’s why, as part of general availability, we’re announcing a 99.99% uptime SLA for Traffic Manager.

Traffic Manager has been available free of charge during preview. Free promotional pricing will remain in effect until December 31, 2013. Starting January 1, 2014, the following pricing will apply:

  • $0.75 per million DNS queries (reducing to $0.375 after 1 billion queries)
  • $0.50 per service endpoint/month.

Full pricing details are available on the Windows Azure Web Site.  Additional details on Traffic Manager, including a detailed description of endpoint monitoring, all configuration options, and the Traffic Manager management REST APIs, are available on MSDN.

Active Directory: General Availability of Application Access

This summer we released the initial preview of our Application Access Enhancements for Windows Azure Active Directory, which enable you to securely implement single-sign-on (SSO) support against SaaS applications as well as LOB-based applications. Since then we’ve added SSO support for more than 500 applications (including popular apps like Office 365, SalesForce.com, Box, Google Apps, Concur, Workday, DropBox, GitHub, etc).

Building upon the enhancements we delivered last month, with this week’s release we are excited to announce the general availability release of the application access functionality within Windows Azure Active Directory. These features are available for all Windows Azure Active Directory customers, at no additional charge, as of today’s release:

  • SSO to every SaaS app we integrate with
  • Application access assignment and removal
  • User provisioning and de-provisioning support
  • Three built-in security reports
  • Management portal support

Every customer can now use the application access features in the Active Directory extension within the Windows Azure Management Portal.

Getting Started

To integrate your Active Directory with either a SaaS or LOB application, navigate to the “Applications” tab of the Directory within the Windows Azure Management Portal and click the “Add” button:

image

Clicking the “Add” button will bring up a dialog that allows you to select whether you want to add a LOB application or a SaaS application:

image

Clicking the second link will bring up a gallery of 500+ popular SaaS applications that you can easily integrate your directory with:

image

Choose an application you wish to enable SSO with and then click the OK button.  This will register the application with your directory:

image

You can then quickly walk through setting up single-sign-on support, and enable your Active Directory to automatically provision accounts with the SaaS application.  This will enable employees who are members of your Active Directory to easily sign into the SaaS application using their corporate/active directory account. 

In addition to making it more convenient for the employee to sign into the app (one less username/password to remember), this SSO support also makes the company’s data even more secure.  If the employee ever leaves the company, and their active directory account is suspended/deleted, they will lose all access to the SaaS application.  The IT administrator of the Active Directory can also optionally choose to enable the Multi-Factor Authentication support that we shipped in September to require employees to use a second form of authentication when logging into the SaaS application (e.g. a phone app or SMS challenge) for even more secure identity access.  The Windows Azure Multi-Factor Authentication Service composes really nicely with the SaaS support we are shipping today – you can literally roll out secure access to any SaaS application (complete with multi-factor authentication support) across your entire enterprise within minutes.

You can learn more about what we’re providing with Windows Azure Active Directory here, and you can ask questions and provide feedback on today’s release in the Windows Azure AD Forum.

Mobile Services: Active Directory integration, Xamarin support, Optimistic concurrency

Enterprises are increasingly going mobile to deliver their line of business apps. Today we are introducing a number of exciting updates to Mobile Services that make it even easier to build mobile LOB apps.

Preview of Windows Azure Active Directory integration with Mobile Services

I am excited to announce the preview of Windows Azure Active Directory support in Mobile Services.  Using this support, mobile business applications can now use the same easy Mobile Services authentication experience to allow employees to sign into their mobile applications with their corporate Active Directory credentials.

With this feature, Windows Azure Active Directory becomes supported as an identity provider in Mobile Services alongside the other identity providers we already support (which include Microsoft Accounts, Facebook ID, Google ID, and Twitter ID).  You can enable Active Directory support by clicking the “Identity” tab within a mobile service:

image
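
Once Active Directory is configured for your mobile service, the client-side sign-in follows the same LoginAsync pattern as the other identity providers. Here's a minimal sketch using the managed Mobile Services SDK - the service URL and application key are hypothetical placeholders:

using Microsoft.WindowsAzure.MobileServices;

// Hypothetical service URL and application key from your mobile service dashboard
var client = new MobileServiceClient(
    "https://yourservice.azure-mobile.net/",
    "your-application-key");

// Prompts the employee to sign in with their corporate Active Directory credentials
MobileServiceUser user = await client.LoginAsync(
    MobileServiceAuthenticationProvider.WindowsAzureActiveDirectory);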

If you are an enterprise developer interested in using the Windows Azure Active Directory support in Mobile Services, please contact us at mobileservices@microsoft.com to sign up for the private preview.

Cross-platform connected apps using Xamarin and Mobile Services

We previously partnered with Xamarin to deliver a Mobile Services SDK that makes it easy to add capabilities such as storage, authentication and push notifications to iOS and Android applications written in C# using Xamarin. Since then, thousands of developers have downloaded the SDK and enjoyed the benefits of building cross-platform mobile applications in C# with Windows Azure as their backend.  More recently as part of the Visual Studio 2013 launch, Microsoft announced a broad collaboration with Xamarin which includes Portable Class Library support for Xamarin platforms.

With today’s release we are making two additional updates to Mobile Services:

  • Delivering an updated Mobile Services Portable Class Library (PCL) SDK that includes support for both Xamarin.iOS and Xamarin.Android
  • New quickstart projects for Xamarin.iOS and Xamarin.Android exposed directly in the Windows Azure Management Portal

These updates make it even easier to build cloud-connected cross-platform mobile applications.

Getting started with Xamarin and Mobile Services

If you navigate to the quickstart page for your Windows Azure Mobile Service you will see there is now a new Xamarin tab:

image

To get started with Xamarin and Windows Azure Mobile Services, all you need to do is click one of the links circled above, install the Xamarin tools, and download the Xamarin starter project that we provide directly on the quick start page above:

image

After downloading the project, unzip and open it in Visual Studio 2013. You will then be prompted to pair your instance of Visual Studio with a Mac so that you can build and run the application on iOS. See here for detailed instructions on the setup process.

Once the setup process is complete, you can select the iPhone Simulator as the target and then just hit F5 within Visual Studio to run and debug the iOS application:

image

The combination of Xamarin and Windows Azure Mobile Services makes it incredibly easy to build iOS and Android applications using C# and Visual Studio.  For more information check out our tutorials and documentation.
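
To give you a sense of the shared code this enables, below is a minimal sketch of calling a Mobile Service from the PCL SDK - the same code compiles for Xamarin.iOS, Xamarin.Android and Windows. The service URL and application key are hypothetical placeholders, and TodoItem is the quickstart's data class:

using Microsoft.WindowsAzure.MobileServices;

// Hypothetical service URL and application key
var client = new MobileServiceClient(
    "https://yourservice.azure-mobile.net/",
    "your-application-key");

// Insert a record and read the table back - identical code on every platform
var todoTable = client.GetTable<TodoItem>();
await todoTable.InsertAsync(new TodoItem { Text = "Build a cross-platform app" });
var items = await todoTable.ToListAsync();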

Optimistic Concurrency Support

Today’s Mobile Services release also adds support for optimistic concurrency. With optimistic concurrency, your application can now detect and resolve conflicting updates submitted by multiple users. For example, if one user retrieves a record from a Mobile Services table to edit it, and meanwhile another user updates the same record, without optimistic concurrency support the first user may overwrite the second user’s update. With optimistic concurrency, conflicting changes are caught, and your application can either give the user a choice to manually resolve the conflicts, or implement an automatic resolution behavior.

When you create a new table, you will notice three system property columns added to support optimistic concurrency: (1) __version, which holds the record’s version, (2) __createdAt, which is the time the record was inserted, and (3) __updatedAt, which is the time the record was last updated.

image

You can use optimistic concurrency in your application by making two changes to your code:

First, add a version property to your data model as shown in the code snippet below. Mobile Services will use this property to detect conflicts while updating the corresponding record in the table:

public class TodoItem
{
    public string Id { get; set; }

    [JsonProperty(PropertyName = "text")]
    public string Text { get; set; }

    [JsonProperty(PropertyName = "__version")]
    public byte[] Version { get; set; }
}

Second, modify your application to handle conflicts by catching the new MobileServicePreconditionFailedException exception. Mobile Services sends back this error along with the server’s version of the conflicting item. Your application can then decide which version to commit back to the server to resolve the detected conflict.
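
Below is a minimal sketch of what that conflict handling might look like with the managed SDK, assuming the TodoItem class above and a table reference named todoTable; it uses a simple "client wins" policy, though you could equally prompt the user:

try
{
    await todoTable.UpdateAsync(localItem);
}
catch (MobileServicePreconditionFailedException<TodoItem> ex)
{
    // ex.Item contains the server's current copy of the record
    TodoItem serverItem = ex.Item;

    // "Client wins": take the server's version token and re-submit our changes
    localItem.Version = serverItem.Version;
    await todoTable.UpdateAsync(localItem);
}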

To learn more about optimistic concurrency, review our new Mobile Services optimistic concurrency tutorial.  Also check out the new custom ID property support we are adding with today’s release – which makes it much easier to handle a variety of richer data modeling scenarios (including sharding support).

Notification Hubs: Price Reduction and Debug Send Improvements

In August I announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency to any mobile device (including Windows Phone, Windows 8, iOS and Android devices). Notification hubs can be used with any mobile app back-end (including ones built using Windows Azure Mobile Services) and can also be used with back-ends that run in the cloud as well as on-premises.

Pricing update: Removing Active Device limits from Notification Hubs paid tiers

To simplify the pricing model of Notification Hubs and pass on cost savings to our customers, we are removing the limits we previously had on the number of Active Devices allowed.  For example, the consumption price for the Notification Hubs Standard Tier now simply becomes $75 for 1 million pushes per month, and $199 for 5 million pushes per month (prorated daily).

These changes and price reductions will be available to all paid tiers starting Dec 15th.  More details on the pricing can be found here.

Troubleshooting Push Notifications with Debug Send

Troubleshooting push notifications can sometimes be tricky, as there are many components involved: your backend, Notification Hubs, platform notification service, and your client app.

To help with that, today’s release adds the ability to easily send test notifications directly from the Windows Azure Management Portal. Simply navigate to the new DEBUG tab within any Notification Hub, specify whether you want to broadcast to all registered devices or provide a tag (or tag expression) to target only a specific device or group of devices, specify the notification payload you wish to send, and then hit “Send”.  For example: below I am choosing to send a test notification message to all my users who have the iOS version of my app, and who have registered to subscribe to “sport-scores” within my app:

image

After the notification is sent, you will get a list of all the device registrations that were targeted by your notification, and the outcome of each delivery as reported by the corresponding platform notification services (WNS, MPNS, APNS, and GCM). This makes it much easier to debug issues.

For help on getting started with Notification Hubs, visit the Notification Hub documentation center.

Web Sites: Diagnostics Support for Automatic Logging to Blob Storage

In September we released an update to Windows Azure Web Sites that enables you to automatically persist HTTP logs to Windows Azure Blob Storage.

Today we also updated Web Sites to support persisting a Web Site’s application diagnostic logs to Blob Storage as well.  This makes it really easy to persist your diagnostics logs as text blobs that you can store indefinitely (since storage accounts can maintain huge amounts of data), and later perform rich data mining/analysis on.  It also makes it much easier to quickly diagnose and understand issues you might be having within your code.

Adding Diagnostics Statements to your Code

Below is a simple example of how you can use the built-in .NET Trace API within System.Diagnostics to instrument code within a web application.  In the scenario below I’ve added a simple trace statement that logs the time it takes to call a particular method (which might call off to a remote service or database that might take a while):

image
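
In case the screenshot is hard to read, here's a minimal sketch of that style of instrumentation inside an MVC controller action - DoSomething is the placeholder method being timed:

using System.Diagnostics;

public ActionResult About()
{
    // Time a method that might call off to a remote service or database
    var stopwatch = Stopwatch.StartNew();
    DoSomething();
    stopwatch.Stop();

    // Written to the application diagnostics log that Web Sites can persist to blob storage
    Trace.TraceInformation("DoSomething took {0} ms", stopwatch.ElapsedMilliseconds);
    return View();
}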

Adding instrumentation code like this makes it much easier for you to quickly determine what might be the cause of a slowdown in an application in production.  By logging the performance data it also makes it possible to analyze performance trends over time (e.g. analyze what the 99th percentile latency is, etc).

Storing Diagnostics Log Files as Blobs in Windows Azure Storage

To enable diagnostic logs to be automatically written directly to blob storage, simply navigate to a Web Site using the Windows Azure Management Portal and click the CONFIGURE tab.  Then navigate to the APPLICATION DIAGNOSTICS section within it.  Starting today, you can now configure “Application Logging” to be persisted to blob storage.  To do this, just toggle the button to be “on”, and then choose the logging level you wish to persist (error, verbose, information, etc):

image

Clicking the green “manage blob storage” button brings up a dialog that allows you to configure which blob storage account you wish to store the diagnostics logs within:

image

Once you are done just click the “ok” button, and then hit “save”.  Now when your application runs, the diagnostic data will automatically be persisted to your blob storage account. 

Looking at the Application Diagnostics Data

Diagnostics logging data is persisted almost immediately as your application runs (we have a trace listener that automatically handles this within web-sites and allows you to write thousands of diagnostics messages per second).

You can use any standard tool that supports Windows Azure Blob Storage to view and download the logs.  Below I’m using the CloudXplorer tool to view my blob storage account:

image

The application diagnostic logs are persisted as .csv text files.  Windows Azure Web Sites automatically persists the files within sub-folders of the blob container that map to the year->month->day->hour of the web-site operation (which makes it easier for you to find the specific file you are looking for).

Because they are .csv text files you can open/process the log files using a wide variety of tools or custom scripts (you can even spin up a Hadoop cluster using Windows Azure HDInsight if you want to analyze lots of them quickly).  Below is a simple example of opening the above diagnostic log file using Excel:

image

Notice above how the date/time, information level, application name, webserver instance ID, event tick, as well as process and thread ID were all persisted in addition to my custom message which logged the latency of the DoSomething method.

Running with Diagnostics Always On

Today’s update now makes it super easy to log your diagnostics trace messages to blob storage (in addition to the HTTP logs that were already supported).  The above steps are literally the only ones required to get started.

Because Windows Azure Storage Accounts can each store up to 100TB, and Windows Azure Web Sites provides an efficient way to persist the logs to them, it is now also possible to always leave diagnostics on in production and log everything your application does.  Having this data persisted makes it much easier for you to understand the health of your applications, debug them when there are issues, and analyze them over time to make them even better.

Storage: Support for Alerting based on Storage metrics

With today’s release we have added support to enable threshold based alert rules for storage metrics. If you have enabled storage analytics metrics, you can now configure alert rules on these metrics.

You can create an alert rule on storage metrics by navigating to the Management Services -> Alert tab in the Windows Azure Management Portal. Click the Add Rule button, then in the rule creation dialog select “storage” as the service type, select the storage account that you want to enable alerts on, and then choose the storage service (blob, table, or queue).

image

Then select the blob service metric you want to monitor, and configure the threshold value and the email address the notification should be sent to:

image

Once set up and enabled, the alert will be listed in the alerts tab:

image

The rule will then be monitored against the storage metric. If the metric crosses the configured threshold, an alert email will automatically be sent.

Monitoring: Preview release of Windows Azure Monitoring Service Library

Today we are releasing a preview of our new Windows Azure Monitoring Services library. This library allows you to retrieve monitoring metrics, and to programmatically configure alerts and autoscale rules for your services.

The list of monitoring service clients that we are shipping today includes:

image

Let’s walk through an example of creating an alert rule using the AlertsClient library. To create an alert rule you need to specify the service that you are creating the alert on and the metric the alert rule operates on. In addition, you need to specify the rule settings for the condition, and the action taken when the alert threshold is reached.  The code below shows how to do this programmatically:

image
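
If the screenshot above is hard to read, the sketch below approximates its shape. Since the library is still in preview, treat the type and member names here as assumptions that may differ slightly from the shipped package; credentials and resourceId are placeholders for your subscription credentials and the id of the service the rule targets:

using System;
using System.Collections.Generic;
using Microsoft.WindowsAzure.Management.Monitoring.Alerts;
using Microsoft.WindowsAzure.Management.Monitoring.Alerts.Models;

// Preview API sketch - names are approximations, not a reference
var alertsClient = new AlertsClient(credentials);

var parameters = new RuleCreateOrUpdateParameters
{
    Rule = new Rule
    {
        Id = Guid.NewGuid().ToString(),
        Name = "HighCpuAlert",
        IsEnabled = true,
        Condition = new ThresholdRuleCondition
        {
            // The service and metric the alert rule operates on
            DataSource = new RuleMetricDataSource
            {
                ResourceId = resourceId,
                MetricName = "Percentage CPU"
            },
            // Trigger when CPU averages above 80% over a 15 minute window
            Operator = ConditionOperator.GreaterThan,
            Threshold = 80,
            WindowSize = TimeSpan.FromMinutes(15)
        },
        // The action taken when the threshold is reached
        Actions = new List<RuleAction> { new RuleEmailAction() }
    }
};

alertsClient.Rules.CreateOrUpdate(parameters);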

Once the code above executes, our monitoring alert rule will be configured without us ever having to manually do anything within the management portal.  You can now write similar code to retrieve operational metrics about a service and set up autoscale rules as well.  This makes it really easy to fully automate tasks.

Installing via NuGet

The monitoring service library is available via NuGet. Since it is still in preview form, you’ll need to add the -IncludePrerelease switch when you retrieve the package.

image
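
From the NuGet Package Manager Console the install command looks like the one below - note that the package id here is an assumption based on the library's naming, so check the NuGet gallery for the exact id:

Install-Package Microsoft.WindowsAzure.Management.Monitoring -IncludePrerelease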

Documentation

The alerts, autoscale and metrics client API documentation can be accessed here.

Summary

Today’s release includes a bunch of great features that enable you to build even better cloud solutions.  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK.

This morning we released another massive set of enhancements to Windows Azure.  Today’s new capabilities include:

  • Storage: Import/Export Hard Disk Drives to your Storage Accounts
  • HDInsight: General Availability of our Hadoop Service in the cloud
  • Virtual Machines: New VM Gallery, ACL support for VIPs
  • Web Sites: WebSocket and Remote Debugging Support
  • Notification Hubs: Segmented customer push notification support with tag expressions
  • TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services
  • Developer Analytics: New Relic support for Web Sites + Mobile Services
  • Service Bus: Support for partitioned queues and topics
  • Billing: New Billing Alert Service that sends email notifications when your bill hits a threshold you define

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them.

Storage: Import/Export Hard Disk Drives to Windows Azure

I am excited to announce the preview of our new Windows Azure Import/Export Service!

The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we’ll automatically transfer the data to or from your Windows Azure Storage account.  This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth).

Encrypted Transport

Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send them, and not have to worry about the data being compromised even if a disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key).  The drive preparation tool we are shipping today makes setting up BitLocker encryption on these hard drives easy.

How to Import/Export your first Hard Drive of Data

You can read our Getting Started Guide to learn more about how to begin using the import/export service.  You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Service Management APIs.

It is really easy to create a new import or export job using the Windows Azure Management Portal.  Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don’t have this tab make sure to sign up for the Import/Export preview):

image

Then click the “Create Import Job” or “Create Export Job” commands at the bottom of it.  This will launch a wizard that easily walks you through the steps required:

image

For more comprehensive information about Import/Export, refer to the Windows Azure Storage team blog.  You can also send questions and comments to the waimportexport@microsoft.com email address.

We think you’ll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects.  We hope you like it.

HDInsight: 100% Compatible Hadoop Service in the Cloud

Last week we announced the general availability release of Windows Azure HDInsight. HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure.  This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios.

HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running, this provides a super cost-effective way to use them). 

You can create Hadoop clusters using either the Windows Azure Management Portal (see below) or using our PowerShell and Cross Platform Command line tools:

image

The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data.  We’ve also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs. 

You can find out more about how to get started with HDInsight here.

Virtual Machines: VM Gallery Enhancements

Today’s update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud.  You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal:

image

The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use:

  • Search: You can now easily search and filter images using the search box in the top-right of the dialog.  For example, simply type “SQL” and we’ll filter to show those images in the gallery that contain that substring.

  • Category Tree-view: Each month we add more built-in VM images to the gallery.  You can continue to browse these using the “All” view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog.  For example, by selecting “Oracle” in the tree-view you can now quickly filter to see the official Oracle supplied images.

  • MSDN and Supported checkboxes: With today’s update we are also introducing filters that make it easy to exclude types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing - you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners.

  • Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we’re providing some additional sort options, like “Newest,” to customize the image list for what suits you best.

  • Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery.

The above improvements make it even easier to use the VM Gallery and quickly create, launch, and run Virtual Machines in the cloud.

Virtual Machines: ACL Support for VIPs

A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today’s release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can now do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance:

image

This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM’s network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or, if you weren’t on a virtual network, you could alternatively limit which public IPs can access your workloads:

image

Here are the default behaviors for ACLs in Windows Azure:

  • By default (i.e. no rules specified), all traffic is permitted.
  • When using only Permit rules, all other traffic is denied.
  • When using only Deny rules, all other traffic is permitted.
  • When there is a combination of Permit and Deny rules, all other traffic is denied.

Lastly, remember that configuring an endpoint does not automatically open it within the VM if the VM also has firewall rules enabled at the OS level.  So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well.

Web Sites: Web Sockets Support

With today’s release you can now use Web Sockets with Windows Azure Web Sites.  This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier).  Higher level programming libraries like SignalR and socket.io are also now supported with it.

You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to “on”:

image

Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications.  Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it.
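
As a quick taste of what this enables, here is a minimal SignalR hub sketch - the ChatHub class and broadcastMessage callback are hypothetical names from the canonical chat example. Clients call Send, and SignalR pushes the message to every connected client, using WebSockets as the transport when available:

using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    // Invoked by clients; broadcasts to the broadcastMessage callback on all connected clients
    public void Send(string name, string message)
    {
        Clients.All.broadcastMessage(name, message);
    }
}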

Web Sites: Remote Debugging Support

The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today’s Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites.

With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud.

Remote Debugging of a Windows Azure Web Site using VS 2013

Enabling remote debugging of a Windows Azure Web Site using VS 2013 is really easy.  Start by opening up your web application’s project within Visual Studio. Then navigate to the “Server Explorer” tab within Visual Studio, and under the Windows Azure->Web Sites node select the deployed web site running within Windows Azure that you want to debug.  Then right-click and choose the “Attach Debugger” option on it:

image

When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure.  The debugger will then stop the web site’s execution when it hits any break points that you have set within your web application’s project inside Visual Studio.  For example, below I set a breakpoint on the “ViewBag.Message” assignment statement within the HomeController of the standard ASP.NET MVC project template.  When I hit refresh on the “About” page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio:

image

Note above how we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session above I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property.  When we click the “Continue” button (or press F5) the app will continue execution and the Web Site will render the content back to the browser.  This makes it super easy to debug web apps remotely.

Tips for Better Debugging

To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio’s Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site which will enable a richer debug experience within Visual Studio.  You can find this option on the Web Publish dialog on the Settings tab:

image

When you ultimately deploy/run the application in production we recommend using the “Release” configuration setting – the release configuration is memory optimized and will provide the best production performance.  To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide.

Notification Hubs: Segmented Push Notification support with tag expressions

In August we announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end.  Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises.

Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users and groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another to millions of St. Louis Cardinals fans, each with a single API call.

New support for using tag expressions to enable advanced customer segmentation

With today’s release we are adding support for even more advanced customer targeting.  You can now identify the customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can not only broadcast notifications to Boston Red Sox fans, but take that segmentation a step further and reach more granular segments. This opens up a variety of scenarios, for example:

  • Offers based on multiple preferences—e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian
  • Push content to multiple segments in a single message—e.g. rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinals fan
  • Avoid presenting subsets of a segment with irrelevant content—e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder

To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. “Loc:Boston”) and interest tags (e.g. “Follows:RedSox”, “Follows:Cardinals”), and then a notification can be sent by your back-end to “(Follows:RedSox || Follows:Cardinals) && Loc:Boston” in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals.

image

This can be done directly in your server backend send logic using the code below:

var notification = new WindowsNotification(messagePayload);

// Deliver the offer to all devices in Boston that follow either team
await hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston");

In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!).  Some other cool use cases for tag expressions that are now supported include:

  • Social: To “all my group except me” - group:id && !user:id
  • Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 …
  • Hours: Send notifications at specific times. E.g. Tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood
  • Versions and platforms: Send a reminder to people still using your first version for Android - version:1.0 && platform:Android

For help on getting started with Notification Hubs, visit the Notification Hub documentation center.  Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions.  They are really powerful and enable a bunch of great new scenarios.

TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services

With today’s Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services.

Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support.  It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio.

With today’s Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services.  This enables a workflow where, when code is checked in, built successfully on an automated build server, and all tests pass, the app is automatically deployed to Windows Azure with zero manual intervention or work required.

The below screen-shots demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services.

Enabling Continuous Delivery to Windows Azure with Team Foundation Services

The project I’m going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I’m hosting using Team Foundation Services.  I did this by creating a “SimpleContinuousDeploymentTest” repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it.  Below is a screen-shot of the Git repository hosted within Team Foundation Services:

image

I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks).  Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository:

image

The cool thing about this is that I don’t have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings.  This build server (and automated testing) support now works with both TFS and Git based source control repositories.

Connecting a Team Foundation Services project to Windows Azure

Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass).  Enabling this is now really easy. 

To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal.  This will create a dialog like below.  I gave the web site a name and then made sure the “Publish from source control” checkbox was selected:

image

When we click next we’ll be prompted for the location of the source repository.  We’ll select “Team Foundation Services”:

image

Once we do this we’ll be prompted for our Team Foundation Services account that our source repository is hosted under (in this case my TFS account is “scottguthrie”):

image

When we click the “Authorize Now” button we’ll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account.  Once we do this we’ll be prompted to pick the source repository we want to connect to.  Starting with today’s Windows Azure release you can now connect to both TFS and Git based source repositories.  This new support allows me to connect to the “SimpleContinuousDeploymentTest” repository we created earlier:

image

Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services.  Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution, and if they pass the app will be automatically deployed to our Web Site in Windows Azure.  You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site:

image

This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way.

Developer Analytics: New Relic support for Web Sites + Mobile Services

With today’s Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Site and Windows Azure Mobile Services.  We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this - and we have updated the Windows Azure Management Portal to make it really easy to configure.

Enabling New Relic with a Windows Azure Web Site

Enabling New Relic support with a Windows Azure Web Site is now really easy.  Simply navigate to the Configure tab of a Web Site and scroll down to the “developer analytics” section that is now within it:

image

Clicking the “add-on” button will display some additional UI.  If you don’t already have a New Relic subscription, you can click the “view windows azure store” button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything):

image

Clicking the “view windows azure store” button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal.  You can use this to browse from a variety of great add-on services – including New Relic:

image

Select “New Relic” within the dialog above, then click the next button, and you’ll be able to choose which type of New Relic subscription you wish to purchase.  For this demo we’ll simply select the “Free Standard Version” – which does not cost anything and can be used forever: 

image

Once we’ve signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site’s configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site.  We can do this by simply selecting it from the “add-on” dropdown (it is automatically populated within it once we have a New Relic subscription in our account):

image

Clicking the “Save” button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site:

image

Deploying the New Relic Agent as part of a Web Site

The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app.  We can do this within Visual Studio by right-clicking on our web project and selecting the “Manage NuGet Packages” context menu:

image

This will bring up the NuGet package manager.  You can search for “New Relic” within it to find the New Relic agent.  Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site’s “Configuration” tab within the Windows Azure Management Portal):

image

Once we install the NuGet package we are all set to go.  We’ll simply re-publish the web site to Windows Azure, and New Relic will now automatically start monitoring the application.

Monitoring a Web Site using New Relic

Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring the health of it.  We can do this by clicking on the “Add Ons” tab in the left-hand side of the Windows Azure Management Portal.  Then select the New Relic add-on we signed-up for within it.  The Windows Azure Management Portal will provide some default information about the add-on when we do this.  Clicking the “Manage” button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account:

image

When we do this a new browser tab will launch with the New Relic admin tool loaded within it:

image

We can now see insights into how our app is performing – without having to have written a single line of monitoring code.  The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see:

  • Performance times (including browser rendering speed) for the overall site and individual pages.  You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify.
  • Information about where in the world your customers are hitting the site from (and how performance varies by region)
  • Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc)
  • Error information including call stack details for exceptions that have occurred at runtime
  • SQL Server profiling information – including which queries executed against your database and what their performance was
  • And a whole bunch more…

The cool thing about New Relic is that you don’t need to write monitoring code within your application to get all of the above reports (plus a lot more).  The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to generate these reports.  This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort.

If you haven’t tried New Relic out yet with Windows Azure I recommend you do so – I think you’ll find it helps you build even better cloud applications.  Following the above steps will help you get started and deliver you a really good application monitoring solution in only minutes.

Service Bus: Support for partitioned queues and topics

With today’s release, we are enabling support within Service Bus for partitioned queues and topics. Partitioning enables you to achieve higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic.  The multiple messaging stores also provide higher availability.

You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic:

image

Read this article to learn more about partitioned queues and topics and how to take advantage of them today.
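
You can also create partitioned entities programmatically. Below is a minimal sketch using the Service Bus .NET client's NamespaceManager - the queue name and connectionString are placeholders:

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

// connectionString is a placeholder for your Service Bus connection string
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

var queueDescription = new QueueDescription("orders")
{
    // Spread the queue across multiple message brokers and messaging stores
    EnablePartitioning = true
};

if (!namespaceManager.QueueExists(queueDescription.Path))
{
    namespaceManager.CreateQueue(queueDescription);
}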

Billing: New Billing Alert Service

Today’s Windows Azure update enables a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure.  This makes it easier to manage your bill and avoid potential surprises at the end of the month.

With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total.  To set up an alert, first sign up for the free Billing Alert Service Preview.  Then visit the account management page, click on a subscription you have set up, and then navigate to the new Alerts tab that is available:

image

The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit.  For example, by clicking the “add alert” button above I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month:

image

The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS.  Try out the new Billing Alert Service Preview today and give us feedback.

Summary

Today’s Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
