04 Apr

Improvements to SQL Elastic Pool configuration experience

We have made some great improvements to the SQL elastic pool configuration experience in the Azure portal. These changes are released alongside the new vCore-based purchasing model for elastic pools and single databases. Our goal is to simplify your experience configuring elastic pools and ensure you are confident in your configuration choices.

Changing service tiers for existing pools

Existing elastic pools can now be scaled up and down between service tiers, making it easy to discover the tier that best fits your business needs. You can switch between the DTU-based and the new vCore-based service tiers, and you can scale down your pool outside of business hours to save costs.

Simplifying configuration of the pool and its databases

Elastic pools offer many settings that customers can customize. The new experience separates and simplifies each aspect of pool management: pool settings, database settings, and database management. This makes it easier to reason about each aspect of the pool while still saving all settings changes in one batch.

Understanding your bill with new cost summary

Our new cost summary experience for elastic pools and single databases

04 Apr

SQL Database: Long-term backup retention preview includes major updates

The preview for long-term backup retention in Azure SQL Database was announced in October 2016, providing you with a way to easily manage long-term retention for your databases – up to 10 years – with backups stored in your own Azure Backup Service Vault.

Based upon feedback gathered during the preview, we are happy to announce a set of major enhancements to the long-term backup retention solution. With this update we have eliminated the need for you to deploy and manage a separate Backup Service Vault. Instead, SQL Database will utilize Azure Blob Storage under the covers to store and manage your long-term backups. This new design will enable flexibility for your backup strategy, and overall more control over costs.

This update brings you the following additional benefits:

More regional support – Long-term retention will be supported in all Azure regions and national clouds.
More flexible backup policies – You can customize the frequency of long-term backups for each database with policies covering weekly, monthly, yearly, and specific week-within-a-year backups.
Management of individual backups – You can delete backups that are not critical for compliance.
Streamlined configuration – No need to provision a separate backup service vault.

What happens with
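The weekly, monthly, and yearly policy options above can be illustrated with a small retention sketch. The function and its rules are illustrative only, not the actual SQL Database long-term retention policy schema:

```python
from datetime import date

def is_retained(backup_date, today, weekly_weeks=0, monthly_months=0,
                yearly_years=0, week_of_year=1):
    """Illustrative retention check: keep weekly backups for N weeks, a
    monthly backup (taken in the first week of a month) for M months, and
    the backup from a chosen ISO week for Y years. Hypothetical rules."""
    age_days = (today - backup_date).days
    if age_days <= weekly_weeks * 7:
        return True                      # covered by the weekly policy
    if age_days <= monthly_months * 31 and backup_date.day <= 7:
        return True                      # covered by the monthly policy
    if age_days <= yearly_years * 365 and backup_date.isocalendar()[1] == week_of_year:
        return True                      # covered by the yearly policy
    return False
```

A real policy would be configured per database through the Azure portal or API; this sketch only shows how the three frequencies compose.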

04 Apr

A flexible new way to purchase Azure SQL Database

We’re excited to announce the preview of an additional purchasing model for the Azure SQL Database Elastic Pool and Single Database deployment options. First announced with SQL Database Managed Instance, the vCore-based model reflects our commitment to customer choice by providing flexibility, control, and transparency. As with Managed Instance, the vCore-based model makes the Elastic Pool and Single Database options eligible for up to 30 percent savings* with the Azure Hybrid Benefit for SQL Server.

Optimize flexibility and performance with two new service tiers

The new vCore-based model introduces two service tiers, General Purpose and Business Critical. These tiers let you independently define and control compute and storage configurations, and optimize them to exactly what your application requires.  If you’re considering a move to the cloud, the new model also provides a straightforward way to translate on-premises workload requirements to the cloud. General Purpose is designed for most business workloads and offers budget-oriented, balanced, and scalable compute and storage options. Business Critical is designed for business applications with high IO requirements and offers the highest resilience to failures.
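As a toy illustration of the choice described above, a selection rule for the two tiers might look like the following. The inputs and the decision are simplified illustrations, not Microsoft sizing guidance:

```python
def suggest_tier(io_intensive: bool, needs_highest_resilience: bool) -> str:
    """Pick between the two vCore service tiers described above.
    Inputs are simplified stand-ins for a real workload assessment."""
    if io_intensive or needs_highest_resilience:
        return "Business Critical"   # high IO, highest resilience to failures
    return "General Purpose"         # budget-oriented, balanced, scalable

def vcore_config(tier: str, vcores: int, max_storage_gb: int) -> dict:
    # In the vCore model, compute (vCores) and storage are set independently.
    return {"tier": tier, "vCores": vcores, "maxSizeGB": max_storage_gb}
```

The second function only underlines the key point of the model: compute and storage are independent knobs rather than a bundled DTU blend.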

Choosing between DTU and vCore-based performance levels

You want the freedom to choose what’s right for your workloads and we’re committed

03 Apr

Power BI Desktop April Feature Summary

Source: https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-april-2018-feature-summary/

This month, on top of our normal visual & data connector improvements, we have two exciting features that let report consumers interact with reports in a new way. The Q&A Explorer enables

03 Apr

Intelligent Edge: Building a Skin Cancer Prediction App with Azure Machine Learning, CoreML & Xamarin

This post is authored by Anusua Trivedi, Carlos Pessoa, Vivek Gupta & Wee Hyong Tok from the Cloud AI Platform team at Microsoft.

Motivation

AI has emerged as one of the most disruptive forces behind digital transformation. AI-powered experiences are augmenting human capabilities and transforming how we live, work, and play, and they have enormous potential to help us lead healthier lives as well.

Introduction

AI is empowering clinicians with deep insights that are helping them make better decisions, and the potential to save lives and money is tremendous. At Microsoft, the Health NExT project is looking at innovative approaches to fuse research, AI and industry expertise to enable a new wave of healthcare innovations. The Microsoft AI platform empowers every developer to innovate and accelerate the development of real-time intelligent apps on edge devices. There are a couple of advantages of running intelligent real-time apps on edge devices – you get:

Lowered latency, for local decision making.
Reduced reliance on internet connectivity.
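The two benefits above can be sketched as an edge-first inference loop. Here `local_model` and `cloud_client` are hypothetical callables standing in for an on-device model (such as one exported to CoreML) and a cloud-hosted endpoint; this is not a real SDK interface:

```python
def classify_on_edge(features, local_model, cloud_client=None,
                     connected=False, confidence_floor=0.6):
    """Edge-first inference sketch: decide locally for low latency, and only
    ask a cloud model for a second opinion when connectivity exists and the
    local prediction is uncertain. Interfaces are hypothetical."""
    label, confidence = local_model(features)
    if confidence < confidence_floor and connected and cloud_client is not None:
        return cloud_client(features)    # optional cloud second opinion
    return label, confidence             # local decision, works offline
```

The key property is that the device never *requires* the network: the cloud call is an enhancement, not a dependency.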

Imagine environments where there’s limited or no connectivity, whether it’s because of lack of communications infrastructure or because of the sensitivity of the

03 Apr

Introducing a new way to purchase Azure monitoring services

Today customers rely on Azure’s application, infrastructure, and network monitoring capabilities to ensure their critical workloads are always up and running. It’s exciting to see the growth of these services and that customers are using multiple monitoring services to get visibility into issues and resolve them faster. To make it even easier to adopt Azure monitoring services, today we are announcing a new consistent purchasing experience across the monitoring services. Three key attributes of this new pricing model are:

1. Consistent pay-as-you-go pricing

We are adopting a simple “pay-as-you-go” model across the complete portfolio of monitoring services. You have full control and transparency, so you pay for only what you use. 

2. Consistent per gigabyte (GB) metering for data ingestion

We are changing the pricing model for data ingestion from “per node” to “per GB”. Customers told us that the value in monitoring came from the amount of data received and the insight built on top of that, rather than the number of nodes. In addition, this new model works best for the future of containers and microservices where the definition of a node is less clear. “Per GB” data ingestion is the new basis for pricing across application, infrastructure,
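A per-GB meter reduces the bill to simple arithmetic. The rate and included allowance below are placeholder numbers for illustration, not Azure's published prices:

```python
def monthly_monitoring_cost(ingested_gb: float, price_per_gb: float,
                            included_gb: float = 0.0) -> float:
    """Pay-as-you-go per-GB metering: bill only the data ingested beyond
    any included allowance, independent of how many nodes produced it."""
    billable_gb = max(ingested_gb - included_gb, 0.0)
    return round(billable_gb * price_per_gb, 2)
```

Note what is absent: there is no node count anywhere in the formula, which is exactly the shift from "per node" to "per GB" described above.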

03 Apr

Accelerate Data Warehouse Modernization to Azure with Informatica’s AI-Driven Platform

Data is central to digital transformation. We have seen many customers move their data workloads to Azure to benefit from the inherent performance and agility of the cloud, and enterprises are moving on-premises workloads to the public cloud at an increasing rate. Results from the 2016 Harvey Nash/KPMG CIO Survey indicate that cloud adoption is now mainstream and accelerating as enterprises shift data-intensive operations to the cloud. Specifically, Platform-as-a-Service (PaaS) adoption is predicted to be the fastest-growing sector of cloud platforms according to KPMG, growing from 32 percent adoption in 2017 to 56 percent in 2020.

Cloud data warehousing is one of the fastest-growing segments. Azure SQL Data Warehouse (SQL DW) allows customers to unleash the elasticity and economics of the cloud while maintaining a fast, flexible, and secure warehouse for all their data.
Microsoft has partnered with Informatica, the leader in Enterprise Cloud Data Management, to help you modernize your data architecture with intelligent data management, so that you can build a cloud data warehouse solution that easily adapts and scales as your data types, volume, applications, and architecture change.

Informatica’s AI-driven Intelligent Data Platform, with solutions purpose-built for Azure, is a modular microservices architecture that accelerates your Azure SQL

03 Apr

Azure IoT Hub is driving down the cost of IoT

Customers rely on the Azure IoT Hub service and love the scale, performance, security, and reliability it provides for connecting billions of IoT devices sending trillions of messages. Azure IoT Hub is already powering production IoT solutions across all major market segments including retail, healthcare, automotive, manufacturing, energy, agriculture, oil and gas, life sciences, smart buildings, and many others. Today, we have a few exciting announcements to make about Azure IoT Hub.

Over the years we’ve noticed that many customers start their IoT journey by simply sending data from devices to the cloud. We refer to this as “device to cloud telemetry,” and it provides a significant benefit. We’ve also noticed that later in their IoT journey most customers realize they need the ability to send commands out to devices, i.e., “cloud to device messaging,” as well as full device management capabilities, so they can manage the software, firmware, and configuration of their devices.
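At its simplest, "device to cloud telemetry" is a device periodically serializing readings into a message and sending it to the cloud. The sketch below shows only the payload shape; the field names are an application-level convention (not an IoT Hub requirement), and a real device would send the message using the Azure IoT device SDK:

```python
import json
import time

def build_telemetry(device_id: str, temperature_c: float, humidity: float) -> str:
    """Serialize one device-to-cloud telemetry reading. Field names are
    illustrative; IoT Hub treats the payload as opaque application data."""
    return json.dumps({
        "deviceId": device_id,
        "temperatureC": temperature_c,
        "humidity": humidity,
        "timestamp": int(time.time()),
    })
```

Cloud-to-device messaging and device management travel in the opposite direction and need additional hub capabilities, which is precisely why they sit beyond the basic telemetry scenario described above.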

At Microsoft, we believe in meeting customers where they are and providing a great experience for them to capture the benefits of IoT. Because of this, we’re excited to announce a new capability of Azure IoT Hub: a “device to cloud telemetry” tier, called the

02 Apr

Week of 4/2 two great webinars: Practical DAX for Power BI and Power BI Embedding -The April 2018 Update

Source: https://powerbi.microsoft.com/en-us/blog/week-of-4-2-two-great-webinars-practical-dax-for-power-bi-and-power-bi-embedding-the-april-2018-update/

This week of 4/2/2018 brings you two great free Power BI webinars: Practical DAX for Power BI with Phil Seamark, and Developing with Power BI Embedding – The April 2018 Update by Ted Pattison!

02 Apr

Azure.Source – Volume 25

One of the biggest news items out of last week was the arrival of Virtual Machine Serial Console access in public preview in all global regions. This is something customers have wanted for some time and the team was ecstatic to finally deliver it. Down below, you’ll find a blog post and episodes of Tuesdays with Corey and Azure Friday that go into all the details and show how it works.

Now in preview

Virtual Machine Serial Console access – Virtual machine serial console enables bidirectional serial console access to your virtual machines. The preview is available in all global Azure regions. To try it, look for Serial console (Preview) in the Support + Troubleshooting section of your virtual machine. Serial console access requires VM Contributor or higher privileges on the virtual machine, ensuring that console connections are restricted to appropriately privileged users.

Preview: SQL Database Transparent Data Encryption with Azure Key Vault configuration checklist – Azure SQL Database and Data Warehouse offer encryption-at-rest by providing Transparent Data Encryption (TDE) for all data written to disk, including databases, log files and backups. The TDE protector is by default managed by the