This is the first post in a blog series that presents success stories from Azure Backup customers. Here we discuss how Azure Backup helped Russell Reynolds.
Russell Reynolds is a global leadership and executive search firm that helps its clients with assessment, executive search, and leadership transitions for boards of directors, chief executive officers, and other key roles within the C-suite. Having moved to Azure to reduce its IT and datacenter costs, the company started to look for an alternative to its tape backups, which were proving both cumbersome and expensive. Enter Azure Backup.
How Azure Backup helped
With Microsoft System Center 2012 R2 Data Protection Manager, they back up their VMware workloads both locally and to the Azure cloud, where data can be retained for up to 99 years, eliminating their need for tapes. They used the Azure Backup offline seeding capability to copy their initial 10 TB of data to the cloud. Thereafter, Azure Backup transfers only incremental data during daily backups, reducing storage consumption and the need for large amounts of bandwidth.
“Even though we used very reputable partners for tape handling, it always made us nervous when our data left our facilities,” says David W. Pfister, Director of Global Distributed Infrastructure and Client
A few weeks back, we announced the public preview release of the new browser-based V2 UI experience for Azure Data Factory. We’ve since partnered with Pragmatic Works, long-time experts in the Microsoft data integration and ETL space, to create a new set of hands-on labs that you can now use to learn how to build common data integration patterns using ADF V2.
In that repo, you will find data files and scripts in the Deployment folder. There are also lab manual folders for each lab module, as well as an overview presentation to walk you through the labs. Below you will find more details on each module.
The repo also includes a series of PowerShell and database scripts as well as Azure ARM templates that will generate resource groups that the labs need in order for you to successfully build out an end-to-end scenario, including some sample data that you can use for Power BI reports in the final Lab Module 9.
Here is how the individual labs are divided:
Lab 1 – Setting up ADF and Resources. Start here to get all of the ARM resource groups and database backup files loaded properly.
Lab 2 – Lift
We heard from several customers that you need a way to view your Azure Security Center alerts in your SIEM solution for a centralized view of your security posture across your organization. Today, we are excited to announce the public preview of a new feature called SIEM Export that allows you to export Azure Security Center alerts into popular SIEM solutions such as Splunk and IBM QRadar, and we are continuing to expand the set of partners we support. This feature is part of our ongoing commitment to provide unified security management and protection for your cloud and on-premises workloads.
Security Center uses a variety of detection capabilities to alert you to potential threats to your environment. Each alert tells you what triggered it, what in your environment was targeted, the source of the attack, and, where applicable, remediation steps. You also have the flexibility to set up custom alerts to address specific needs in your environment.
Now you can take these alerts from Security Center and integrate them into your own SIEM solution, so you can quickly see what needs your attention in one place and take action.
To move your Azure Security Center alerts to a
Today we’re pleased to announce three new Time Series Insights (TSI) explorer capabilities that we think our users are going to love.
First, we are delighted to share that the TSI explorer, the visualization service of TSI, is now generally available and backed by our SLA. Second, we’ve made the TSI explorer more accessible and easier to use for those with visual and fine-motor disabilities. And finally, we’ve made it easy to export aggregate event data to other analytics tools like Microsoft Excel.
Now that the TSI explorer is generally available, users will notice that the explorer is backed by TSI’s service level agreement (SLA), and we’ve removed the preview moniker from the backsplash when the explorer is loading. We have many customers using TSI in production environments and we’re thrilled to offer them the same SLA that backs the rest of the product. The ActionPoint IoT-PREDICT solution is a great example of one of those customers using the TSI explorer to enable their customers to explore and analyze time series data quickly. Check out their solution below.
There are no limits to what people can achieve when technology reflects the diversity of everyone who uses it. Transparency, accountability, and
Microsoft recently became one of the first global cloud service providers to achieve full accreditation by the Ministry of Electronics and Information Technology (MeitY) for the Government of India. MeitY lists accredited cloud service providers in the government Cloud Services Directory, which enables public sector organizations to compare and procure those services.
Making government services available to citizens online is a key part of the Digital India programme which aims to “transform India into a digitally empowered society and knowledge economy.” With the MeitY accreditation, referred to by MeitY as empanelment, Microsoft is now positioned to fully partner with India’s public sector organizations as they move to reshape India’s economic landscape.
Through Microsoft Azure, public sector organizations in India can now draw on a wide range of deployment models and service offerings, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Disaster Recovery as a Service, Dev/Test, Virtual Desktop as a Service, and Managed Backup.
The MeitY accreditation was the result of a rigorous audit process conducted by the Standardization Testing and Quality Certification (STQC) Directorate, a government organization that provides quality assurance services. The evaluation framework is based on the work of the Meghraj Cloud Initiative, established
Azure Cosmos DB provides a scalable database solution that can handle both batch and real-time ingestion and querying, enabling developers to implement lambda architectures with low TCO. Lambda architectures enable efficient processing of massive data sets, using batch-processing, stream-processing, and a serving layer to minimize the latency involved in querying big data.
To implement a lambda architecture, you can use a combination of the following technologies to accelerate real-time big data analytics:
- Azure Cosmos DB, the industry’s first globally distributed, multi-model database service
- Apache Spark for Azure HDInsight, a processing framework that runs large-scale data analytics applications
- Azure Cosmos DB change feed, which streams new data to the batch layer for HDInsight to process
- The Spark to Azure Cosmos DB Connector
We wrote a detailed article that describes the fundamentals of a lambda architecture based on the original multi-layer design and the benefits of a “rearchitected” lambda architecture that simplifies operations.
What is a lambda architecture?
The basic principles of a lambda architecture are depicted in the figure above:
- All data is pushed into both the batch layer and the speed layer.
- The batch layer has a master dataset (immutable, append-only set of raw data)
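To make these principles concrete, here is a minimal, deliberately generic sketch in plain Python (no Azure services involved, and all names are hypothetical): the batch layer keeps an immutable master dataset and periodically recomputes a batch view, the speed layer keeps a real-time view of events not yet covered by a batch run, and the serving layer merges the two at query time.

```python
class LambdaPipeline:
    """Toy model of a lambda architecture: batch, speed, and serving layers."""

    def __init__(self):
        self.master_dataset = []   # batch layer: immutable, append-only raw data
        self.batch_view = {}       # precomputed aggregate (here: counts per key)
        self.speed_view = {}       # real-time view of events since the last batch run

    def ingest(self, event):
        # All data is pushed into both the batch layer and the speed layer.
        self.master_dataset.append(event)
        key = event["key"]
        self.speed_view[key] = self.speed_view.get(key, 0) + 1

    def run_batch(self):
        # The batch layer recomputes views from the full master dataset,
        # after which the speed layer's delta can be discarded.
        view = {}
        for event in self.master_dataset:
            view[event["key"]] = view.get(event["key"], 0) + 1
        self.batch_view = view
        self.speed_view = {}

    def query(self, key):
        # The serving layer merges the batch view with the real-time view.
        return self.batch_view.get(key, 0) + self.speed_view.get(key, 0)

pipe = LambdaPipeline()
pipe.ingest({"key": "sensor-1"})
pipe.run_batch()                 # batch view now covers the first event
pipe.ingest({"key": "sensor-1"})  # lands in the speed layer
print(pipe.query("sensor-1"))     # 2: batch view + real-time view
```

In the article's setup, the Cosmos DB change feed plays the role of `ingest` fan-out and Spark on HDInsight plays the role of `run_batch`.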
This post is authored by Xiaoyong Zhu, Program Manager, Max Kaznady, Senior Data Scientist, and Gilbert Hendry, Senior Data Scientist, at Microsoft.
There is a growing number of useful applications of machine learning and artificial intelligence in the domain of audio, such as home surveillance (e.g. detecting glass breaking and alarm events), security (e.g. detecting the sounds of explosions and gunshots), driverless cars (sound event detection for increased safety), predictive maintenance (forecasting machine failures in a manufacturing process based on vibrations), real-time translation in Skype, and even music synthesis.
The human brain processes such a wide variety of sounds so effortlessly – be it the bark of puppies, audible alarms from smoke or carbon monoxide detectors, or people talking loudly in a coffee shop – that most of us tend to take this faculty for granted. But what if we could apply AI to help the hearing impaired achieve something similar? That would be something special.
So, is AI ready to help the hearing impaired understand and react to the world around them?
Today, we are excited to announce the general availability of several features in Azure Stream Analytics. These features are designed to address a variety of scenarios for enterprise and non-enterprise customers alike. They include:
Sub-streams support: Many streaming applications that customers build using Azure Stream Analytics, such as IoT, connected-car and automotive telematics, and smart elevators, require the telemetry stream from each asset or source to be processed on its own, all in the same job, without merging the timelines of events belonging to different devices or sources. This is because their clocks may be drastically out of sync. The new sub-streams support in Stream Analytics offers this powerful capability with very simple language constructs: a new keyword, OVER, extends the TIMESTAMP BY clause for this purpose. More details can be found on the TIMESTAMP BY documentation page.
For example, you can process data from different senders (toll stations) without applying time policies across senders, with the input data partitioned by TollId.
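The semantics are easiest to see outside the query language. The sketch below is plain Python, not Stream Analytics syntax, and simulates what TIMESTAMP BY ... OVER TollId achieves: each sender gets its own tumbling windows on its own timeline, so one sender's badly skewed clock never disturbs another's results.

```python
from collections import defaultdict

def tumbling_counts(events, window_seconds):
    """Count events per sender in tumbling windows, keeping each sender's
    timeline independent (the sub-streams idea): one sender's skewed clock
    never delays or reorders another sender's windows."""
    # windows[sender][window_start] -> event count
    windows = defaultdict(lambda: defaultdict(int))
    for sender, timestamp in events:
        window_start = timestamp - (timestamp % window_seconds)
        windows[sender][window_start] += 1
    return {sender: dict(w) for sender, w in windows.items()}

# Two toll stations whose clocks are ~1000 seconds apart; each still
# gets correct 10-second windows on its own timeline.
events = [
    ("toll-1", 0), ("toll-1", 5), ("toll-1", 12),
    ("toll-2", 1000), ("toll-2", 1003),
]
print(tumbling_counts(events, 10))
# {'toll-1': {0: 2, 10: 1}, 'toll-2': {1000: 2}}
```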
Egress to Azure Functions: Azure Functions is a serverless compute service in Azure that helps users run custom code triggered by events occurring in Azure or third-party services. This ability
This GA release brings forward a few significant changes:
- We have split the Storage SDKs into four packages, one each for Blob, Table, Queue, and File. As announced, this was done to reduce the footprint of the libraries and allow developers to consume only the packages they are interested in.
- Support is now available for newer Azure Storage REST API versions and service features. See below for details on each SDK.

Azure Storage SDK for Python
Storage SDK packages for Blob, File, and Queue in Python are available on PyPI with version 1.0. This release supports the April 4, 2017 REST API version, bringing support for archival storage and blob tiering. The Table package is released under the name azure-cosmosdb-table.
Here is a Hello World sample with the Storage SDK for Python:
from azure.storage.blob import BlockBlobService
import os

# Create a blob service client
block_blob_service = BlockBlobService(
    os.environ.get('AZURE_STORAGE_ACCOUNT_NAME'),
    os.environ.get('AZURE_STORAGE_ACCOUNT_KEY'))

# Upload a blob from text
block_blob_service.create_blob_from_text(
    'mycontainer', 'myblockblob', 'Hello World!')

# Download a blob into a buffer and print its contents
blob = block_blob_service.get_blob_to_text('mycontainer', 'myblockblob')
print(blob.content)
Modern applications are taking maximum advantage of the agility and flexibility of the cloud by moving away from monolithic architectures and instead using a set of distinct services, all working together. This includes foundational services offered by a cloud platform like Azure (Database, Storage, IoT, Compute, Serverless Functions, etc.) and application-specific services (inventory management, payment services, manufacturing processes, mobile experiences, etc.). In these new architectures, event-driven execution has become a cornerstone: it replaces cumbersome polling for communication between services with a simple push mechanism. These events could include IoT device signals, cloud provisioning notifications, storage blob events, or even custom scenarios such as new employees being added to HR systems. Reacting to such events efficiently and reliably is critical in these new app paradigms.
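To illustrate the push model in the abstract (this is a generic toy broker, not Event Grid's API), compare polling a service for changes with registering a handler that is invoked the moment an event is published:

```python
class EventBroker:
    """Minimal push-based broker: handlers subscribe to an event type and
    are invoked as soon as a matching event is published, with no polling."""

    def __init__(self):
        self.handlers = {}  # event_type -> list of callables

    def subscribe(self, event_type, handler):
        self.handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, data):
        # Push the event to every subscribed handler immediately.
        for handler in self.handlers.get(event_type, []):
            handler(data)

broker = EventBroker()
received = []
broker.subscribe("blobCreated", lambda data: received.append(data))
broker.publish("blobCreated", {"url": "https://example.com/photo.jpg"})
print(received)
# [{'url': 'https://example.com/photo.jpg'}]
```

Without the subscription, each consumer would have to repeatedly ask the producer "anything new yet?"; with it, consumers do no work until an event actually arrives.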
Today, I am excited to announce the general availability of Azure Event Grid, a fully managed event routing service that simplifies the development of event-based applications.
Azure Event Grid is the first of its kind, enabling applications and services to subscribe to all the events they need to handle whether they come from Azure services or from other parts of the same application. These events are delivered through push semantics, simplifying your code and reducing your
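As a sketch of what publishing a custom event involves, the snippet below builds a payload in the Event Grid event schema. The event type, subject, and data here are hypothetical, and the final POST to a real topic endpoint (authenticated with an aeg-sas-key header) is only described in a comment, not performed.

```python
import datetime
import json
import uuid

def make_event(event_type, subject, data):
    """Build one event in the Event Grid event schema. The publisher
    supplies id, eventType, subject, eventTime, data, and dataVersion;
    Event Grid stamps the remaining fields (topic, metadataVersion)."""
    return {
        "id": str(uuid.uuid4()),
        "eventType": event_type,
        "subject": subject,
        "eventTime": datetime.datetime.utcnow().isoformat() + "Z",
        "data": data,
        "dataVersion": "1.0",
    }

# Custom scenario from above: a new employee added to an HR system.
event = make_event(
    event_type="Contoso.HR.EmployeeAdded",   # hypothetical custom event type
    subject="/hr/employees/42",
    data={"name": "J. Doe", "department": "Engineering"},
)
payload = json.dumps([event])  # Event Grid accepts an array of events
# A real publisher would POST `payload` to the custom topic's endpoint,
# passing the topic key in the 'aeg-sas-key' request header.
```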