Category Archives: Storage, Backup & Recovery

02 Apr

Announcing server-side encryption with customer-managed keys for Azure Managed Disks

Today, we’re announcing the general availability of server-side encryption (SSE) with customer-managed keys (CMK) for Azure Managed Disks. Azure customers already benefit from SSE with platform-managed keys for Managed Disks, which is enabled by default. SSE with CMK improves on platform-managed keys by giving you control of the encryption keys to meet your compliance needs.

Today, customers can also use Azure Disk Encryption, which leverages the Windows BitLocker feature and the Linux dm-crypt feature to encrypt Managed Disks with CMK within the guest virtual machine (VM). SSE with CMK improves on Azure Disk Encryption by encrypting data in the Azure Storage service, which lets you use any OS type and image, including custom images, for your VMs.

SSE with CMK is integrated with Azure Key Vault, which provides highly available and scalable secure storage for your keys backed by Hardware Security Modules. You can either bring your own keys (BYOK) to your Key Vault or generate new keys in the Key Vault.

About key management

Managed Disks are encrypted and decrypted transparently using 256-bit Advanced Encryption Standard (AES) encryption, one of the strongest block ciphers available. The Storage service handles the encryption and decryption in a fully transparent fashion using envelope encryption.
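To show how the pieces fit together, here is a minimal sketch using the azure-mgmt-compute and azure-identity Python packages: it creates a disk encryption set that references a customer-managed key in Key Vault and then creates a managed disk encrypted with that key. The resource names, region, key URL, and disk size are placeholders, and this is an illustrative outline rather than an official sample.

```python
# Sketch: SSE with customer-managed keys for a managed disk.
# Assumes azure-identity and azure-mgmt-compute are installed; all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-rg"

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# 1. Create a disk encryption set that points at a customer-managed key in Key Vault.
#    (Its managed identity still needs get/wrapKey/unwrapKey access on the vault,
#    which must be granted separately.)
des = compute.disk_encryption_sets.begin_create_or_update(
    RESOURCE_GROUP,
    "my-encryption-set",
    {
        "location": "eastus",
        "identity": {"type": "SystemAssigned"},
        "active_key": {
            "source_vault": {"id": "<key-vault-resource-id>"},
            "key_url": "https://<vault-name>.vault.azure.net/keys/<key-name>/<version>",
        },
    },
).result()

# 2. Create a managed disk encrypted at rest with the customer-managed key.
compute.disks.begin_create_or_update(
    RESOURCE_GROUP,
    "my-data-disk",
    {
        "location": "eastus",
        "creation_data": {"create_option": "Empty"},
        "disk_size_gb": 128,
        "encryption": {
            "type": "EncryptedAtRestWithCustomerKey",
            "disk_encryption_set_id": des.id,
        },
    },
).result()
```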

19 Mar

Filesystem SDKs for Azure Data Lake Storage Gen2 now generally available

Since the general availability of Azure Data Lake Storage (ADLS) Gen2 in February 2019, customers have been getting insights for their big data analytics workloads at cloud scale. Integration with analytics engines is critical for their analytics workloads, and equally important is the ability to programmatically ingest, manage, and analyze data. This ability is critical for key areas of enterprise data lakes such as data ingestion, event-driven big data platforms, machine learning (ML), and advanced analytics. Programmatic access is possible today using ADLS Gen2 REST APIs, Blob REST APIs, or capabilities via multi-protocol access. As part of our developer ecosystem journey, our goal is to make customer application development for programmatic access easier than ever before.

Towards this goal, we’re announcing the general availability of Python, .NET, Java, and JS filesystem SDKs for Azure Data Lake Storage (ADLS) Gen2 in all Azure regions. This includes support for CRUD operations on filesystems, directories, files, and permissions with filesystem semantics for ADLS Gen2. Customers can now use this familiar filesystem programming model to simplify application development for ADLS Gen2. These filesystem SDKs streamline our customers’ ability to ingest, manage, and analyze data for ADLS Gen2 and help them gain insights at cloud scale.
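As a brief illustration of the filesystem programming model, the sketch below uses the azure-storage-file-datalake Python package to create a filesystem, a directory, and a file, then append and commit data; the account URL, credential, and paths are placeholders.

```python
# Minimal sketch of the ADLS Gen2 filesystem model with the Python SDK
# (azure-storage-file-datalake). Account, credential, and paths are placeholders.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",
    credential="<account-key-or-token-credential>",
)

filesystem = service.create_file_system("raw-data")           # top-level filesystem
directory = filesystem.create_directory("telemetry/2020/03")  # hierarchical directory
file_client = directory.create_file("events.json")

payload = b'{"event": "start"}'
file_client.append_data(payload, offset=0, length=len(payload))  # stage the bytes
file_client.flush_data(len(payload))                             # commit the file
```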

04 Mar

Announcing preview of Backup Reports

We recently announced a new solution, Backup Explorer, to enable you as a backup administrator to perform real-time monitoring of your backups, helping you achieve increased efficiency in your day-to-day operations.

But what if you could also be proactive in the way you manage your backup estate? What if there was a way to unlock the latent power of your backup metadata to make more informed business decisions?

For instance, any business would be well served by a systematic approach to forecasting backup usage. Often, this involves analyzing how backup storage has increased over time for a given tenant, subscription, resource group, or for individual workloads. Such analysis requires the paired ability to aggregate data over a long period of time and present it in a way that allows the reader to quickly derive insights.

Today, we are pleased to announce the public preview of Backup Reports. Leveraging Azure Monitor Logs and Azure Workbooks, Backup Reports serve as a one-stop destination for tracking usage, auditing backups and restores, and identifying key trends at different levels of granularity.

With our reports, you can answer questions such as ‘Which Backup Items consume the most storage?’ and ‘Which machines have had consistently misbehaving backups?’, among others.
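To give a feel for answering such questions programmatically, here is a rough sketch that queries backup metadata from a Log Analytics workspace with the azure-monitor-query Python package. The workspace ID is a placeholder, and the table and column names (AddonAzureBackupStorage, BackupItemUniqueId, StorageConsumedInMBs) are assumptions about how the backup diagnostics data is shaped, not a documented contract.

```python
# Illustrative sketch: find the backup items consuming the most storage by
# querying backup diagnostics routed to a Log Analytics workspace.
# Requires azure-identity and azure-monitor-query; table/column names are assumptions.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AddonAzureBackupStorage
| summarize StorageConsumedInMBs = max(StorageConsumedInMBs) by BackupItemUniqueId
| top 10 by StorageConsumedInMBs desc
"""

response = client.query_workspace(
    "<log-analytics-workspace-id>",
    query,
    timespan=timedelta(days=30),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```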

18 Feb

Azure Offline Backup with Azure Data Box now in preview

An ever-increasing number of enterprises, even as they adopt a hybrid IT strategy, continue to retain mission-critical data on-premises and look towards the public cloud as an effective offsite location for their backups. Azure Backup, Azure’s built-in data-protection solution, provides a simple, secure, and cost-effective mechanism to back up these data assets over the network to Azure while eliminating on-premises backup infrastructure. After the initial full backup of data, Azure Backup transfers only incremental changes in the data, thereby delivering continued savings on both network and storage.

With the exponential growth in critical enterprise data, the initial full backups are reaching terabyte scale. Transferring these large full backups over the network, especially in high-latency network environments or remote offices, may take weeks or even months. Our customers are looking for more efficient ways beyond fast networks to transfer these large initial backups to Azure. Microsoft Azure Data Box solves the problem of transferring large data sets to Azure by enabling the “offline” transfer of data using secure, portable, and easy-to-get Microsoft appliances.

Announcing the preview of Azure Offline Backup with Azure Data Box

Today, we are thrilled to add the power of Azure Data Box to Azure Backup, and announce the preview program for offline backup using Azure Data Box.

05 Feb

Backup Explorer now available in preview

https://azure.microsoft.com/blog/backup-explorer-now-available-in-preview/

As organizations continue to expand their use of IT and the cloud, protecting critical enterprise data becomes extremely important. And if you are a backup admin on Microsoft Azure, being able to efficiently monitor backups on a daily basis is…

16 Dec

Better performance with bursting enhancement on Azure Disks

We introduced the preview of bursting support on Azure Premium SSD Disks, and new disk sizes of 4/8/16 GiB on both Premium and Standard SSDs, at Microsoft Ignite in November. We would like to share more details about it. With bursting, eligible Premium SSD disks can now achieve up to 30x the provisioned performance target, handling spiky workloads better. If you have workloads running on-premises with less predictable disk traffic, you can migrate to Azure and improve your overall performance by taking advantage of bursting support.

Disk bursting is implemented with a credit-based system: you accumulate credits when traffic is below the provisioned target and consume credits when traffic exceeds it. You can best leverage the capability in the scenarios below:

- OS disks to accelerate virtual machine (VM) boot: You can expect a boost during VM boot, when reads to the OS disk may be issued at a higher rate. If you host cloud workstations on Azure, your applications’ launch time can potentially be reduced by taking advantage of the additional disk throughput.
- Data disks to accommodate spiky traffic: Some production operations trigger spikes of disk input/output (IO) by design. For example, if you conduct…
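To make the credit mechanics concrete, here is a small, self-contained Python model of a credit bucket: credits accrue while demand stays under the provisioned target and are spent to serve demand above it, up to the burst limit. The specific numbers (provisioned IOPS, burst ceiling, bucket size) are illustrative and not taken from the announcement.

```python
# Toy model of credit-based disk bursting. All numbers are illustrative only.
def simulate_bursting(demand, provisioned, burst_limit, bucket_size):
    credits = bucket_size          # assume the credit bucket starts full
    served = []
    for wanted in demand:          # one sample per second, in IOPS
        if wanted <= provisioned:
            # Below the provisioned target: serve everything and bank the slack.
            credits = min(bucket_size, credits + (provisioned - wanted))
            served.append(wanted)
        else:
            # Above the target: spend credits to burst, capped at the burst limit.
            extra = min(wanted, burst_limit) - provisioned
            spend = min(extra, credits)
            credits -= spend
            served.append(provisioned + spend)
    return served

# Example: a small disk provisioned at 120 IOPS bursting toward 3,500 IOPS
# during a short spike (hypothetical values).
print(simulate_bursting([50] * 10 + [3500] * 5, provisioned=120,
                        burst_limit=3500, bucket_size=1000))
```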

03 Dec

Extended filesystem programming capabilities in Azure Data Lake Storage

Since the general availability of Azure Data Lake Storage Gen2 in February 2019, customers have been getting insights at cloud scale faster than ever before. Integration with analytics engines is critical for their analytics workloads, and equally important is the ability to programmatically ingest, manage, and analyze data. This ability is critical for key areas of enterprise data lakes such as data ingestion, event-driven big data platforms, machine learning, and advanced analytics. Programmatic access is possible today using Azure Data Lake Storage Gen2 REST APIs or Blob REST APIs. In addition, customers can enable continuous integration and continuous delivery (CI/CD) pipelines using Blob PowerShell and CLI capabilities via multi-protocol access. As part of the journey to enable our developer ecosystem, our goal is to make customer application development easier than ever before.

We are excited to announce the public preview of .NET SDK, Python SDK, Java SDK, PowerShell, and CLI for filesystem operations for Azure Data Lake Storage Gen2. Customers who are used to the familiar filesystem programming model can now implement this model using the .NET, Python, and Java SDKs. Customers can also incorporate these filesystem operations into their CI/CD pipelines using PowerShell and CLI, thereby enriching those pipelines.
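As a hedged illustration of the permission side of these filesystem operations, the sketch below uses the azure-storage-file-datalake Python package to set and read a POSIX-style ACL on a directory; the account, credential, paths, and ACL string are placeholders. The announcement notes that the same filesystem operations are also reachable from PowerShell and CLI for CI/CD scenarios.

```python
# Illustrative sketch: manage POSIX-style permissions on an ADLS Gen2 directory
# with the Python filesystem SDK. Account, credential, and paths are placeholders.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",
    credential="<account-key-or-token-credential>",
)

directory = service.get_file_system_client("raw-data").get_directory_client("telemetry")

# Grant the owning user rwx, the owning group r-x, and nothing to others.
directory.set_access_control(acl="user::rwx,group::r-x,other::---")

# Read the effective access control back.
print(directory.get_access_control()["acl"])
```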

02 Dec

SAP HANA backup using Azure Backup is now generally available

Today, we are sharing the general availability of Microsoft Azure Backup’s solution for SAP HANA databases in the UK South region.

Azure Backup is Azure’s native backup solution, which is BackInt certified by SAP. This offering aligns with Azure Backup’s mantra of zero-infrastructure backups, eliminating the need to deploy and manage backup infrastructure. You can now seamlessly back up and restore SAP HANA databases running on Microsoft Azure Virtual Machines (VMs), including M-series VMs, and leverage the enterprise management capabilities that Azure Backup provides.

Benefits

- 15-minute Recovery Point Objective (RPO): Recovery of critical data of up to 15 minutes is possible.
- One-click, point-in-time restores: Easily restore production data on SAP HANA databases to alternate servers. Chaining of backups and catalogs to perform restores is all managed by Azure behind the scenes.
- Long-term retention: For rigorous compliance and audit needs, you can retain your backups for years, based on the retention duration, beyond which the recovery points are pruned automatically by the built-in lifecycle management capability.
- Backup management from Azure: Use Azure Backup’s management and monitoring capabilities for an improved management experience.

Watch this space for more updates on the GA rollout to other regions. We are currently…

26 Nov

Multi-protocol access on Data Lake Storage now generally available

We are excited to announce the general availability of multi-protocol access for Azure Data Lake Storage. Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data. This is a no-compromise solution that allows both the Azure Blob Storage API and the Azure Data Lake Storage API to access data on a single storage account. You can store all your different types of data in one place, which gives you the flexibility to make the best use of your data as your use case evolves. The general availability of multi-protocol access creates the foundation to enable object storage capabilities on Data Lake Storage. This brings together the best of both object storage and the Hadoop Distributed File System (HDFS) to enable scenarios that, until today, were not possible without copying data.

Broader ecosystem of applications and features

Multi-protocol access provides a powerful foundation to enable integrations and features for Data Lake Storage. Existing object storage applications and connectors can now be used to access data stored in Data Lake Storage with no changes. This vastly accelerates the integration of Azure services and the partner ecosystem with Data Lake Storage. We are also…
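To show what accessing the same data through both APIs looks like in code, here is a small sketch that writes a file through the Data Lake Storage (dfs) endpoint and reads the identical object back through the Blob endpoint, using the azure-storage-file-datalake and azure-storage-blob Python packages; the account name, credential, and paths are placeholders.

```python
# Illustrative sketch of multi-protocol access: write via the ADLS Gen2 (dfs)
# endpoint, read the same object via the Blob endpoint. Names are placeholders.
from azure.storage.blob import BlobServiceClient
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT = "<storage-account>"
CREDENTIAL = "<account-key-or-token-credential>"

# Write a file with the Data Lake filesystem API...
datalake = DataLakeServiceClient(
    f"https://{ACCOUNT}.dfs.core.windows.net", credential=CREDENTIAL
)
file_client = datalake.get_file_system_client("analytics").get_file_client("logs/day1.csv")
file_client.upload_data(b"col1,col2\n1,2\n", overwrite=True)

# ...and read the identical object back with the Blob API.
blob = BlobServiceClient(
    f"https://{ACCOUNT}.blob.core.windows.net", credential=CREDENTIAL
)
print(blob.get_blob_client("analytics", "logs/day1.csv").download_blob().readall())
```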

21 Nov

Azure Backup support for SQL Server 2019 and Restore as files

https://azure.microsoft.com/blog/azure-backup-support-for-sql-server-2019-and-restore-as-files/

As SQL Server 2019 continues to push the boundaries of availability, performance, and data intelligence, a centrally managed, enterprise-scale backup solution is imperative to ensure the protection of all that data. This is especially true if you are running the…
