01 Feb

Virtual Network Service Endpoints and Firewalls for Azure Storage now generally available

This blog post was co-authored by Anitha Adusumilli, Principal Program Manager, Azure Networking.

Today we are announcing the general availability of Firewalls and Virtual Networks (VNets) for Azure Storage, along with Virtual Network Service Endpoints. Azure Storage Firewalls and Virtual Networks use Virtual Network Service Endpoints to let administrators create network rules that allow traffic only from selected VNets and subnets, creating a secure network boundary for their data. These features are now available in all Azure public cloud regions and Azure Government, and as part of general availability they are backed by the standard SLA. There is no additional billing for virtual network access through service endpoints; the current pricing model for Azure Storage applies as is.

Customers often prefer multiple layers of security to help protect their data, including network-based access controls as well as authentication and authorization-based protections. With the general availability of Firewalls and Virtual Networks for Storage and VNet Service Endpoints, we enable network-based access control. These new network-focused features let customers define network access rules, ensuring that only requests coming from approved Azure VNets or specified public IP ranges will be allowed to reach a specific storage account.
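For readers who prefer to script the configuration, here is a minimal sketch using the Azure SDK for Python (azure-mgmt-storage); the portal and CLI can accomplish the same thing. The resource names, subnet ID, and IP range below are placeholders, and the subnet is assumed to already have the Microsoft.Storage service endpoint enabled.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    IPRule, NetworkRuleSet, StorageAccountUpdateParameters, VirtualNetworkRule)

# Placeholders: substitute your own subscription, resource group, account,
# subnet resource ID, and public IP range.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-rg"
ACCOUNT_NAME = "mystorageaccount"
SUBNET_ID = ("/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/"
             "Microsoft.Network/virtualNetworks/my-vnet/subnets/my-subnet")

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Deny all traffic by default, then allow one subnet (which must already have
# the Microsoft.Storage service endpoint enabled) and one public IP range.
rules = NetworkRuleSet(
    default_action="Deny",
    bypass="AzureServices",
    virtual_network_rules=[VirtualNetworkRule(virtual_network_resource_id=SUBNET_ID)],
    ip_rules=[IPRule(ip_address_or_range="203.0.113.0/24")],
)

client.storage_accounts.update(
    RESOURCE_GROUP, ACCOUNT_NAME,
    StorageAccountUpdateParameters(network_rule_set=rules))
```

With the default action set to Deny, only the listed subnet and IP range (plus trusted Azure services, via the bypass setting) can reach the account.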

01 Feb

A great developer experience for Ansible

As customers grow their deployed applications in Azure, we are seeing increased interest in the DevOps space for configuration management. In the rapidly evolving cloud space, bringing on-premises expertise to work fluently in the cloud increases efficiency. With our strong and growing partnership with Red Hat, I am extremely excited to announce some key improvements to the developer experience for Ansible on Azure.

Ansible in Azure Cloud Shell

Ansible is now available, pre-installed and ready to use, for every Azure user in the Azure Cloud Shell. We want to make it really easy for anyone to get started with Ansible. The Azure Cloud Shell is a browser-based command-line experience that lets you run Ansible commands directly in the portal. This shell can run on any machine and in any browser; it even runs on your phone!

With this enhancement you can use Ansible right in the Azure Portal. There is no need to install Python dependencies, no additional configuration, and no additional authentication. It just works!
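If you want to confirm the pre-installed toolchain from a script rather than an interactive session, a minimal Python sketch like the one below shells out to the same ansible CLI that Cloud Shell provides; the ad-hoc ping of localhost is just an illustrative smoke test.

```python
import subprocess

# Smoke test for the Ansible toolchain that ships with Azure Cloud Shell:
# print the installed version, then run the built-in "ping" module against
# the implicit localhost inventory (no inventory file or SSH setup needed).
subprocess.run(["ansible", "--version"], check=True)
subprocess.run(["ansible", "localhost", "-m", "ping"], check=True)
```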

Ansible extension in Visual Studio Code

We also have released an Ansible extension for Visual Studio Code that allows for faster development and testing of Ansible playbooks. You can use this extension to

01 Feb

How is AI for video different from AI for images?

Extracting insights from video using AI technologies presents an additional set of challenges, and opportunities for optimization, compared to images. There is a misconception that AI for video is simply a matter of extracting frames from a video and running computer vision algorithms on each frame. You can certainly do that, but it would not give you the insights you are truly after. In this blog post, I will use a few examples to explain the shortcomings of processing individual video frames in isolation. I will not go into the details of the additional algorithms required to overcome these shortcomings; Video Indexer implements several such video-specific algorithms.
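To make the naive approach concrete, here is a rough sketch of per-frame processing with OpenCV; the video file name is a placeholder, and the stock Haar-cascade face detector stands in for whatever per-image model you might pick. This illustrates the baseline whose shortcomings the examples below describe, not how Video Indexer works.

```python
import cv2

# Naive per-frame pipeline: decode every frame and run a stock face detector
# on it, recording whether a face was found at each timestamp.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("interview.mp4")          # placeholder file name
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0

presence = []                                    # list of (seconds, face_found)
frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    presence.append((frame_index / fps, len(faces) > 0))
    frame_index += 1
cap.release()

# The resulting timeline shows gaps whenever the person turns away from the
# camera, even though they never leave the shot -- the shortcoming discussed
# in the examples below.
```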

Person presence in the video

Look at the first 25 seconds of this video.

Notice that Doug is present for the entire 25 seconds.

If I were to draw a timeline for when Doug is present in the video, it should be something like this.

[Timeline figure: Doug is present continuously from 0:00 through 0:25.]

Note that Doug is not always facing the camera. Seven seconds into the video he is looking at Emily, and the same thing happens at 23 seconds.

If you were to run face detection at

31 Jan

Jenkins on Azure: from zero to hero

We are excited to announce a refresh for the Microsoft Jenkins offer in Azure Marketplace.

Like the previous version, this offer allows customers to run a Jenkins master on a Linux (Ubuntu 16.04 LTS) VM in Azure. The price is the cost of running the software components and Azure infrastructure deployed by the solution template. If you are looking to run Jenkins in the cloud, you will have full control over the Jenkins master you set up.

So why are we so excited about this refresh? Because now you can go from zero to hero. Just set up the server with the configurations you need and start building in the least amount of time.

Some highlights:

Virtual network: We added VNet support so that you can provision Jenkins in your own virtual network/subnet.

Azure integration: You can choose to enable Managed Service Identity (MSI) or supply an Azure Service Principal. We add the credential to the Jenkins credential store automatically so that you don’t have to do this manually. Choose “Off” if you prefer to set this up later.

Build: By enabling the VM agent or Azure Container Instances (ACI), you can start building your projects in Azure right away. We create the default
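Once the deployment finishes, you can sanity-check the new master from any machine. The sketch below assumes a placeholder host name and an API token generated for your Jenkins user, and uses the standard /api/json endpoint to list the jobs that exist.

```python
import requests

# Placeholders: the Jenkins URL created by the solution template and an API
# token generated for your admin user.
JENKINS_URL = "http://my-jenkins.example.com:8080"
AUTH = ("admin", "<api-token>")

# The standard Jenkins JSON API; a 200 response with a job list means the
# master is up and reachable.
resp = requests.get(f"{JENKINS_URL}/api/json", auth=AUTH, timeout=10)
resp.raise_for_status()
for job in resp.json().get("jobs", []):
    print(job["name"], job.get("color"))
```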

31 Jan

Customer success stories with Azure Backup: Russell Reynolds

This is the first post in a blog series presenting customer success stories with Azure Backup. Here we discuss how Azure Backup helped Russell Reynolds.

Customer background

Russell Reynolds is a global leadership and executive search firm that helps its clients with assessment, executive search, and leadership transitions within boards of directors, chief executive officers, and other key roles in the C-suite. Having moved to Azure to reduce its IT and datacenter costs, the company started to look for an alternative to its tape backups, which were proving both cumbersome and expensive. Enter Azure Backup.

How Azure Backup helped

With Microsoft System Center 2012 R2 Data Protection Manager, they back up their VMware workloads locally and to the Azure cloud, where backups can be retained for up to 99 years, eliminating their need for tapes. They used the Azure Backup offline seeding capability to copy their initial 10 TB of data to the cloud. Thereafter, Azure Backup transfers only incremental data during daily backups, reducing storage consumption and the need for large amounts of bandwidth.

“Even though we used very reputable partners for tape handling, it always made us nervous when our data left our facilities,” says David W. Pfister, Director of Global Distributed Infrastructure and Client

31 Jan

New Azure Data Factory self-paced hands-on lab for UI

A few weeks back, we announced the public preview release of the new browser-based V2 UI experience for Azure Data Factory. We’ve since partnered with Pragmatic Works, long-time experts in the Microsoft data integration and ETL space, to create a new set of hands-on labs that you can now use to learn how to build common data integration (DI) patterns using ADF V2.

In the lab repo, you will find data files and scripts in the Deployment folder. There are also lab manual folders for each lab module, as well as an overview presentation to walk you through the labs. Below you will find more details on each module.

The repo also includes a series of PowerShell and database scripts, as well as Azure Resource Manager (ARM) templates, that generate the resource groups the labs need so that you can successfully build out an end-to-end scenario, including some sample data that you can use for Power BI reports in the final lab module, Lab 9.
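If you prefer to drive the setup from Python rather than the provided PowerShell scripts, a rough sketch with the Azure SDK (azure-mgmt-resource) is shown below; the file names, resource group, and location are placeholders, and you should point it at the templates shipped in the Deployment folder.

```python
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"            # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create (or reuse) a resource group for the lab resources.
client.resource_groups.create_or_update("adf-lab-rg", {"location": "eastus"})

# Template and parameter file names are placeholders; use the ones shipped in
# the lab's Deployment folder.
with open("azuredeploy.json") as f:
    template = json.load(f)
with open("azuredeploy.parameters.json") as f:
    parameters = json.load(f).get("parameters", {})

poller = client.deployments.begin_create_or_update(
    "adf-lab-rg",
    "adf-lab-deployment",
    {"properties": {"mode": "Incremental",
                    "template": template,
                    "parameters": parameters}},
)
poller.result()                                  # block until deployment completes
```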

Here is how the individual labs are divided:

Lab 1 – Setting up ADF and Resources: Start here to get all of the ARM resource groups and database backup files loaded properly.

Lab 2 – Lift

31 Jan

Integrate Azure Security Center alerts into SIEM solutions

We heard from several customers that you need a way to view your Azure Security Center alerts in your SIEM solution for a centralized view of your security posture across your organization. Today, we are excited to announce the public preview of a new feature, SIEM Export, that allows you to export Azure Security Center alerts to popular SIEM solutions such as Splunk and IBM QRadar, and we are continuing to grow the number of partners we support. This feature is part of our ongoing commitment to provide unified security management and protection for your cloud and on-premises workloads.

Security Center uses a variety of detection capabilities to alert you to potential threats in your environment. Each alert can tell you what triggered it, what in your environment was targeted, the source of the attack, and, if necessary, remediation steps. You also have the flexibility to set up custom alerts to address specific needs in your environment.

Now you can take these alerts from Security Center and integrate them into your own SIEM solution, so you can quickly see what needs your attention in one place and take action.
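If you want to see programmatically which alerts would flow to your SIEM, one general-purpose option (not specific to the SIEM Export feature) is to query the Microsoft.Security/alerts REST endpoint directly. The sketch below uses a placeholder subscription ID, and the api-version is an assumption you should verify against the current REST reference.

```python
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION_ID = "<subscription-id>"            # placeholder
API_VERSION = "2022-01-01"                       # assumption: check the current REST docs

token = DefaultAzureCredential().get_token(
    "https://management.azure.com/.default").token
url = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
       f"/providers/Microsoft.Security/alerts?api-version={API_VERSION}")

resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
resp.raise_for_status()

# Print a quick summary of each alert; property names may vary by API version.
for alert in resp.json().get("value", []):
    props = alert.get("properties", {})
    print(props.get("alertDisplayName"), "-", props.get("severity"))
```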

To move your Azure Security Center alerts to a

31 Jan

Three new reasons to love the TSI explorer

Today we’re pleased to announce three new Time Series Insights (TSI) explorer capabilities that we think our users are going to love. 

First, we are delighted to share that the TSI explorer, the visualization service of TSI, is now generally available and backed by our SLA.  Second, we’ve made the TSI explorer more accessible and easier to use for those with visual and fine-motor disabilities. And finally, we’ve made it easy to export aggregate event data to other analytics tools like Microsoft Excel. 

Now that the TSI explorer is generally available, users will notice that the explorer is backed by TSI’s service level agreement (SLA), and we’ve removed the preview moniker from the splash screen shown while the explorer is loading. We have many customers using TSI in production environments, and we’re thrilled to offer them the same SLA that backs the rest of the product. The ActionPoint IoT-PREDICT solution is a great example of a customer using the TSI explorer to enable their customers to explore and analyze time series data quickly. Check out their solution below.

There are no limits to what people can achieve when technology reflects the diversity of everyone who uses it. Transparency, accountability, and

31 Jan

Full MeitY accreditation enables Indian public sector to deploy on Azure

Microsoft recently became one of the first global cloud service providers to achieve full accreditation by the Ministry of Electronics and Information Technology (MeitY) for the Government of India. MeitY lists accredited cloud service providers in the government Cloud Services Directory, which enables public sector organizations to compare and procure those services.

Making government services available to citizens online is a key part of the Digital India programme which aims to “transform India into a digitally empowered society and knowledge economy.” With the MeitY accreditation, referred to by MeitY as empanelment, Microsoft is now positioned to fully partner with India’s public sector organizations as they move to reshape India’s economic landscape.

Through Microsoft Azure, public sector organizations in India can now draw on a wide range of deployment models and service offerings, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Disaster Recovery as a Service, Dev/Test, Virtual Desktop as a Service, and Managed Backup.

The MeitY accreditation was the result of a rigorous audit process conducted by the Standardization Testing and Quality Certification (STQC) Directorate, a government organization that provides quality assurance services. The evaluation framework is based on the work of the Meghraj Cloud Initiative, established

30 Jan

Lambda Architecture using Azure #CosmosDB: Faster performance, Low TCO, Low DevOps

Azure Cosmos DB provides a scalable database solution that can handle both batch and real-time ingestion and querying, enabling developers to implement lambda architectures with low TCO. Lambda architectures enable efficient processing of massive data sets by combining batch processing, stream processing, and a serving layer to minimize the latency involved in querying big data.

To implement a lambda architecture, you can use a combination of the following technologies to accelerate real-time big data analytics:

Azure Cosmos DB, the industry’s first globally distributed, multi-model database service.

Apache Spark for Azure HDInsight, a processing framework that runs large-scale data analytics applications.

Azure Cosmos DB change feed, which streams new data to the batch layer for HDInsight to process.

The Spark to Azure Cosmos DB Connector.
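As a rough illustration of the batch-layer ingestion step, the PySpark sketch below reads the Azure Cosmos DB change feed through the Spark connector. The endpoint, key, and database/collection names are placeholders, the azure-cosmosdb-spark connector JAR is assumed to be attached to your HDInsight cluster, and the option names should be checked against the connector version you install.

```python
from pyspark.sql import SparkSession

# Build (or reuse) a Spark session on the HDInsight cluster; the
# azure-cosmosdb-spark connector JAR is assumed to be on the classpath.
spark = SparkSession.builder.appName("cosmosdb-change-feed-ingest").getOrCreate()

# Placeholder connection settings; option names follow the connector's
# documented change-feed configuration and may differ between versions.
change_feed_config = {
    "Endpoint": "https://<your-account>.documents.azure.com:443/",
    "Masterkey": "<your-account-key>",
    "Database": "lambda",
    "Collection": "newdata",
    "ReadChangeFeed": "true",
    "ChangeFeedQueryName": "batch-layer-ingest",
    "ChangeFeedStartFromTheBeginning": "true",
}

# Read the documents surfaced by the change feed, ready to be appended to the
# batch layer's master dataset for HDInsight to process.
new_data = (spark.read
            .format("com.microsoft.azure.cosmosdb.spark")
            .options(**change_feed_config)
            .load())

new_data.show(10)
```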

We wrote a detailed article that describes the fundamentals of a lambda architecture based on the original multi-layer design and the benefits of a “rearchitected” lambda architecture that simplifies operations.

What is a lambda architecture?

The basic principles of a lambda architecture are as follows:

All data is pushed into both the batch layer and the speed layer.

The batch layer has a master dataset (immutable, append-only set of raw data)