If you are building an IoT solution in the cloud, chances are your focus is on the devices and what you can accomplish with them. You might want to process data coming from a network of devices in real time, analyze the data to gain insights, get alerted for special conditions, manage the devices themselves, and so on. What is less interesting to you is setting up and managing the infrastructure in the cloud, which will enable you to do the above. This is where serverless comes in.
Serverless technologies, like Azure Functions, take away the burden of managing infrastructure and enable you to focus on your IoT-powered business logic. IoT projects usually have variable traffic, which means provisioning infrastructure for peak loads isn’t the best strategy. Adopting serverless allows your solutions to scale dynamically while keeping costs low.
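To make this concrete, here is a minimal sketch of the kind of per-event business logic a serverless function might run over device telemetry. All names (the field names, device IDs, and the temperature threshold) are illustrative assumptions, not part of any real device schema.

```python
from typing import Optional

ALERT_THRESHOLD_C = 75.0  # hypothetical over-temperature limit


def process_telemetry(event: dict) -> Optional[dict]:
    """Return an alert record if a reading breaches the threshold, else None."""
    temp = event.get("temperature")
    if temp is not None and temp > ALERT_THRESHOLD_C:
        return {
            "device_id": event.get("deviceId"),
            "alert": "over-temperature",
            "value": temp,
        }
    return None  # normal reading, nothing to do


# Example: only the second reading triggers an alert.
readings = [
    {"deviceId": "dev-1", "temperature": 62.1},
    {"deviceId": "dev-2", "temperature": 81.4},
]
alerts = [a for a in (process_telemetry(r) for r in readings) if a]
```

In a serverless setup, a function like this would be wired to a trigger (for example, an IoT Hub event stream) and scale automatically with the incoming message rate.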
This video shows a great application of a serverless architecture to receive data from a device, transform it in real time using machine learning, and send it back to the device. It is based on the DevKit Translator IoT project.
After reading this blogpost, you will be able to build your own custom email notifications for SQL Database Automatic tuning recommendations. We have listened to our customers requesting this functionality and have created a custom solution based on readily available technologies on Azure.
SQL Database performance tuning recommendations are generated by Azure SQL Database Automatic tuning. This feature delivers peak performance and stable workloads through continuous, AI-driven database performance tuning.
Tuning recommendations are provided for each individual SQL database in an Azure subscription for which Automatic tuning is enabled. Recommendations cover index creation, index deletion, and optimization of query execution plans, and are provided only when the AI considers them beneficial to database performance.
Email notifications for Automatic tuning
Some of our customers have indicated a need to receive automated email notifications with suggested SQL Database Automatic tuning recommendations, so that they can view them and build automated alerts. For example, when the solution recommends dropping an index to improve database performance, some customers would prefer to be notified of such an event. Another customer scenario is emailing automated tuning recommendations to the different database administrators in charge of different databases.
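A notification job along these lines first needs to pull the recommendations from the ARM REST API before composing an email. The sketch below only builds the request URL; the exact resource path, advisor name, and api-version are assumptions you should verify against the current Azure REST reference, and the subscription and resource names are placeholders.

```python
BASE = "https://management.azure.com"


def recommended_actions_url(sub_id: str, rg: str, server: str, db: str,
                            advisor: str = "CreateIndex",
                            api_version: str = "2014-04-01") -> str:
    """Compose the ARM URL for a database advisor's recommended actions."""
    return (
        f"{BASE}/subscriptions/{sub_id}/resourceGroups/{rg}"
        f"/providers/Microsoft.Sql/servers/{server}/databases/{db}"
        f"/advisors/{advisor}/recommendedActions"
        f"?api-version={api_version}"
    )


url = recommended_actions_url("sub-0000", "my-rg", "my-server", "my-db")
```

A scheduled Azure Function or Logic App could call this URL with a bearer token, filter for new recommendations, and hand the result to an email connector.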
The word is out, and the industry is taking notice. Azure Cosmos DB is the world’s first globally distributed, multi-model database service with native NoSQL support. Designed for the cloud, Azure Cosmos DB enables you to build planet-scale applications that bring data to where your users are, with SLA-backed guarantees for low latency, throughput, and 99.99% availability.
The experts at IDG’s InfoWorld recently recognized Azure Cosmos DB in the InfoWorld Technology of the Year Awards, zeroing in on its “innovative approach to the complexities of building and managing distributed systems,” which includes recognition for leveraging the work of Turing Award winner Leslie Lamport to deliver multiple consistency models. Azure Cosmos DB was also recognized for delivering a globally distributed system where users anywhere in the world can see the same version of data, no matter their location.
In addition, InfoWorld complimented the flexibility and variety of use cases with Azure Cosmos DB, from JSON-based document stores to support for MongoDB APIs and a SQL query option for Azure’s Table Storage.
“Do you need a distributed NoSQL database with a choice of APIs and consistency models? That would be Microsoft’s Azure Cosmos DB.”—InfoWorld, Technology of the Year 2018: The best hardware, software, and cloud services
In December we announced the release of ARM APIs for usage details. We continue that transformation to ARM APIs with this release of the Marketplace Charges API for Enterprise and Web Direct customers (with a few exceptions documented in the limitations below) and a Price Sheet API for Enterprise customers.
The updated ARM API offers the benefits of:
- Migration from a key-based authorization model to ARM-based authentication, resulting in an improved security posture and the ability to use ARM RBAC for authorization.
- Support for Web Direct subscriptions, with a few exceptions documented below.

Limitations on subscriptions
The following subscription types are currently not supported with the Marketplace Charges API:
- MS-AZR-0145P
- MS-AZR-0146P
- MS-AZR-0159P
- MS-AZR-0036P
- MS-AZR-0143P
- MS-AZR-0015P
- MS-AZR-0144P

Support for E-Tags
Customers who are heavy users of our APIs have requested a way to avoid polling the API and receiving a full response that has not changed since the last call. To address this request, we are releasing an updated version of the legacy usage details API that uses E-Tags to let callers know when data has been refreshed. Each call to the API returns an E-Tag; in subsequent calls, the caller can pass that E-Tag back, and if the data has not changed, the API can respond without returning the full, unchanged payload.
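Here is a caller-side sketch of that pattern, following standard HTTP conditional-request semantics (`If-None-Match` and a 304 status). The token, E-Tag value, and response handling are illustrative assumptions, not the API's documented contract.

```python
from typing import Optional


def build_headers(token: str, last_etag: Optional[str]) -> dict:
    """Attach auth plus an If-None-Match header when we have a prior E-Tag."""
    headers = {"Authorization": f"Bearer {token}"}
    if last_etag:
        headers["If-None-Match"] = last_etag  # ask only for changed data
    return headers


def handle_response(status: int, etag: str, cached):
    """Reuse the local cache when the server reports the data is unchanged."""
    if status == 304:          # 304 Not Modified: nothing new since last_etag
        return cached, etag
    return "fresh-data", etag  # placeholder for parsing the new payload


headers = build_headers("token-123", 'W/"abc"')
body, etag = handle_response(304, 'W/"abc"', cached="cached-data")
```

The payoff is that a polling job can run frequently without repeatedly downloading and reprocessing an unchanged usage report.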
Last month we announced a preview release of subscription-level budgets for enterprise customers. That was only the first step. Today we’re announcing the release of additional features that support the scoping of more granular budgets with filters, as well as support for usage and cost budgets. We’ve heard from our customers that multiple teams share a subscription and that resource groups serve as cost boundaries. Today’s updates support resource group and resource-level budgets in addition to subscription-level budgets. The budgets API is now generally available, and we welcome your feedback.
The preview release of budgets only supported cost based budgets. In this release we are also adding support for usage budgets. Additionally, support for filters enables you to define the scope at which a budget applies.
Here are a few common scenarios that the budgets API addresses:
- A budget for the subscription with no constraints.
- A resource group budget with no constraints.
- A budget for multiple resource groups within a subscription.
- A budget for multiple resources within a resource group or a subscription.
- Budgets based on usage on a subscription or resource group.
This enables the most common scenarios, where resource groups or specific resources serve as cost boundaries.
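As a sketch, a budget scoped to specific resource groups might be defined with a payload like the one below. The property shape loosely mirrors the Consumption budgets API, but treat every name and value here as an assumption to verify against the current API reference; the resource group names are placeholders.

```python
# Hypothetical budget definition: a monthly cost budget limited to two
# resource groups via a filter, rather than the whole subscription.
budget = {
    "properties": {
        "category": "Cost",            # or "Usage" for usage-based budgets
        "amount": 500.0,
        "timeGrain": "Monthly",
        "timePeriod": {
            "startDate": "2018-02-01T00:00:00Z",
            "endDate": "2018-12-31T00:00:00Z",
        },
        # Filters narrow the scope below the subscription level.
        "filters": {
            "resourceGroups": ["team-a-rg", "team-b-rg"],
        },
    }
}
```

The same shape, with a `resources` filter instead, would express a budget over individual resources.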
This blog post was co-authored by Anitha Adusumilli, Principal Program Manager, Azure Networking.
Today we are announcing the general availability of Firewalls and Virtual Networks (VNets) for Azure Storage, along with Virtual Network Service Endpoints. Azure Storage Firewalls and Virtual Networks use Virtual Network Service Endpoints to let administrators create network rules that allow traffic only from selected VNets and subnets, creating a secure network boundary for their data. These features are now available in all Azure public cloud regions and Azure Government. As part of moving to general availability, they are now backed by the standard SLAs. There is no additional billing for virtual network access through service endpoints; the current pricing model for Azure Storage applies as is today.
Customers often prefer multiple layers of security to help protect their data. This includes network-based access control protections as well as authentication and authorization-based protections. As part of the general availability of Firewalls and Virtual Networks for Storage and VNet Service Endpoints, we enable network-based access control. These new network-focused features allow the customer to define network access-based security, ensuring that only requests coming from approved Azure VNets or specified public IP ranges will be allowed to access their storage accounts.
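As an illustration, the network boundary described above boils down to a small network-rules section on the storage account. The sketch below builds such a section as plain data: it denies all traffic by default, then allows one subnet and one public IP range. The property names follow the ARM storage account schema as an assumption, and all identifiers are placeholders.

```python
# Placeholder ARM resource ID for the subnet that should keep access.
subnet_id = (
    "/subscriptions/sub-0000/resourceGroups/my-rg"
    "/providers/Microsoft.Network/virtualNetworks/my-vnet/subnets/app-subnet"
)

# Hypothetical networkAcls section of a storage account definition.
network_acls = {
    "defaultAction": "Deny",                 # block everything by default
    "virtualNetworkRules": [{"id": subnet_id, "action": "Allow"}],
    "ipRules": [{"value": "203.0.113.0/24", "action": "Allow"}],
    "bypass": "AzureServices",               # let trusted Azure services in
}
```

The key design point is the deny-by-default posture: access is opt-in per subnet or IP range, which is what makes the storage account's network boundary explicit.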
As customers grow their applications deployed in Azure, we are seeing increased interest in the DevOps space for configuration management. In the rapidly evolving cloud space, bringing on-premises expertise to work fluently in the cloud increases efficiency. With our strong and growing partnership with Red Hat, I am extremely excited to announce some key improvements to the developer experience of Ansible on Azure.
Ansible is now available, pre-installed and ready to use for every Azure user in the Azure Cloud Shell. We want to make it really easy for anyone to get started with Ansible. The Azure Cloud Shell is a browser-based command-line experience that enables Ansible commands to be executed directly in the portal. This shell can run on any machine and any browser. It even runs on your phone!
With this enhancement you can use Ansible right in the Azure portal. There is no need to install Python dependencies, no additional configuration, and no additional authentication. It just works!
We also have released an Ansible extension for Visual Studio Code that allows for faster development and testing of Ansible playbooks. You can use this extension to
Extracting insights from video using AI technologies presents an additional set of challenges, and opportunities for optimization, compared to images. There is a misconception that AI for video is simply a matter of extracting frames from a video and running computer vision algorithms on each frame. You can certainly do that, but it would not give you the insights you are truly after. In this blog post, I will use a few examples to explain the shortcomings of processing individual video frames in isolation. I will not go over the details of the additional algorithms required to overcome these shortcomings; Video Indexer implements several such video-specific algorithms.
Person presence in the video
Look at the first 25 seconds of this video.
Notice that Doug is present for the entire 25 seconds.
If I were to draw a timeline for when Doug is present in the video, it should be something like this.
Note that Doug is not always facing the camera. Seven seconds into the video he is looking at Emily, and the same thing happens at 23 seconds.
If you were to run face detection at the level of individual frames, the timeline would show gaps whenever Doug looks away from the camera, even though he is present the entire time.
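One simple way to see why a video-aware step is needed: take the raw per-frame detection windows and merge any that are separated by a short gap into a single presence interval. This is a minimal sketch of that idea; the timestamps and the gap tolerance are illustrative, and this is not Video Indexer's actual algorithm.

```python
def merge_detections(intervals, max_gap=3.0):
    """Merge (start, end) detection intervals separated by gaps <= max_gap seconds."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start - merged[-1][1] <= max_gap:
            # Short dropout (person looked away): extend the current interval.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged


# Hypothetical raw detections: Doug's face drops out around seconds 7 and 23.
detections = [(0, 6.5), (8.0, 22.5), (24.0, 25.0)]
presence = merge_detections(detections)  # one continuous presence interval
```

With the brief dropouts bridged, the output matches the intuitive timeline: Doug is present for the full 25 seconds, even though raw face detection loses him twice.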
We are excited to announce a refresh for the Microsoft Jenkins offer in Azure Marketplace.
Like the previous version, this offer allows customers to run a Jenkins master on a Linux (Ubuntu 16.04 LTS) VM in Azure. The price is the cost of running the software components and Azure infrastructure deployed by the solution template. If you are looking to run Jenkins in the cloud, you will have full control over the Jenkins master you set up.
So why are we so excited about this refresh? Because now you can go from zero to hero. Just set up the server with the configurations you need and start building in the least amount of time.
Virtual network: We added support for VNET so that you can provision Jenkins in your own virtual network/subnet.
Azure integration: You can choose to enable Managed Service Identity (MSI) or supply an Azure service principal. We add the credential to the Jenkins credential store automatically so that you don’t have to do this manually. Choose “off” if you prefer to set this up later.

Build: By enabling VM agents or Azure Container Instances (ACI), you can start building your projects in Azure right away. We create the default agent configuration for you.
This is the first in a blog series presenting success stories from customers using Azure Backup. Here we discuss how Azure Backup helped Russell Reynolds.
Russell Reynolds is a global leadership and executive search firm that helps its clients with assessment, executive search, and leadership transitions within boards of directors, chief executive officers, and other key roles in the C-suite. Having moved to Azure to reduce its IT and datacenter costs, the company started to look for an alternative to tape backups, which were proving both cumbersome and expensive. Enter Azure Backup.
How Azure Backup helped
With Microsoft System Center 2012 R2 Data Protection Manager, they back up their VMware workloads locally and to the Azure cloud, where data can be retained for up to 99 years, eliminating their need for tapes. They used the Azure Backup offline seeding capability to copy their initial 10 TB of data to the cloud. Thereafter, Azure Backup transfers only incremental data during daily backups, reducing storage consumption and the need for high bandwidth.
“Even though we used very reputable partners for tape handling, it always made us nervous when our data left our facilities,” says David W. Pfister, Director of Global Distributed Infrastructure and Client