Last year, we released Blob-Level Tiering which allows you to transition blobs between the Hot, Cool, and Archive tiers without moving data between accounts. Both Blob-Level Tiering and Archive Storage help you optimize storage performance and cost. You asked us to make it easier to manage and automate, so we did. Today we are excited to announce the public preview of Blob Storage lifecycle management so that you can automate blob tiering and retention with lifecycle management policies.
Data sets have unique lifecycles. Some data is accessed often early in its lifecycle, but the need for access drops drastically as the data ages. Some data remains idle in the cloud and is rarely accessed once stored. Some data expires days or months after creation, while other data sets are actively read and modified throughout their lifetimes. Azure Blob Storage lifecycle management offers a rich, rule-based policy that you can use to transition your data to the best access tier and to expire data at the end of its lifecycle.
Lifecycle management policy helps you:
- Transition blobs to a cooler storage tier (Hot to Cool, Hot to Archive, or Cool to Archive) to optimize for performance and cost
- Delete blobs at the end of their lifecycles
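As a sketch of what such a policy can look like (the rule name and container prefix here are hypothetical, and the exact schema may evolve during the preview), a rule that tiers block blobs to Cool after 30 days, to Archive after 90, and deletes them after 365 might be written as:

```json
{
  "version": "0.5",
  "rules": [
    {
      "name": "agingRule",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "mycontainer/" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 },
            "delete": { "daysAfterModificationGreaterThan": 365 }
          }
        }
      }
    }
  ]
}
```

The policy runs against the storage account once set, so cooling and expiry happen automatically rather than through per-blob tier calls.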
ADLA now offers some new, unparalleled capabilities for processing files of any format, including Parquet, at tremendous scale.
Previously: Handling tens of thousands of files is painful!
Many of our customers tell us that handling a large number of files is challenging, if not downright painful, in all the big data systems they have tried. Figure 1 shows the distribution of file sizes in common data lake systems. Most files are less than one GB, although a few may be huge.
Figure 1: The pain of many small files
ADLA was developed from a system originally designed to operate on very large files whose internal structure helps with scale-out, but it operated on only a couple of hundred to about 3,000 files. It also over-allocated resources when processing small files by assigning one extract vertex per file (a vertex is a compute container that executes a specific part of the script on a partition of the data).
Processing big data in real-time is now an operational necessity for many businesses. Azure Stream Analytics is Microsoft’s serverless real-time analytics offering for complex event processing. It enables customers to unlock valuable insights and gain competitive advantage by harnessing the power of big data. Here are eight reasons why you should choose ASA for real-time analytics.
1. Fully integrated with Azure ecosystem: Build powerful pipelines with a few clicks
Whether you have millions of IoT devices streaming data to Azure IoT Hub or apps sending critical telemetry events to Azure Event Hubs, it takes only a few clicks to connect multiple sources and sinks and create an end-to-end pipeline. Azure Stream Analytics provides best-in-class integration with services that store your output, such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Store. It also enables you to trigger custom workflows downstream with Azure Functions, Azure Service Bus Queues, and Azure Service Bus Topics, or to create real-time dashboards using Power BI.
2. Developer productivity
One of the biggest advantages of Stream Analytics is its simple SQL-based query language with powerful temporal constructs for analyzing data in motion. Familiarity with SQL is sufficient to author powerful queries. Azure Stream Analytics also supports language extensibility through JavaScript user-defined functions.
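As an illustrative sketch (the input and output aliases here are hypothetical placeholders for configured sources and sinks), a query that counts events per device over 30-second tumbling windows looks like ordinary SQL plus a windowing construct:

```sql
-- Count events per device in 30-second tumbling windows
SELECT
    DeviceId,
    COUNT(*) AS EventCount,
    System.Timestamp AS WindowEnd
INTO
    [output-powerbi]
FROM
    [input-iothub] TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY
    DeviceId,
    TumblingWindow(second, 30)
```

The `TIMESTAMP BY` clause tells the engine which field carries event time, and `TumblingWindow` slices the stream into fixed, non-overlapping intervals.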
Last month, we announced the extension of Azure Security Center’s detection for Linux. This post aims to demonstrate how existing Windows detections often have Linux analogs. A specific example is the encoding or obfuscation of command lines. The reasons an attacker might wish to encode their commands include minimizing quoting and escaping issues when encapsulating commands in scripts, and providing a basic means of hiding from host-based intrusion detection. These techniques have the additional benefit of avoiding the need to drop a file to disk, reducing the risk to an attacker of being detected by traditional anti-virus products.
Encoded PowerShell attacks on Windows
There are many examples of such behavior being used in attacks against Windows environments. A previous blog post highlights one such technique to encode PowerShell commands as base64. PowerShell actually makes this amazingly easy, allowing commands of the form:
powershell.exe -EncodedCommand dwByAGkAdABlAC0AbwB1AHQAcAB1AHQAIABFAG4AYwBvAGQAZQBkACAAUABvAHcAZQByAFMAaABlAGwAbAAgAHMAYwByAGkAcAB0AA==
The only real stumbling block is the requirement that the decoded command be UTF-16 (hence the prevalence of ‘A’ characters in the resulting base64: they encode the zero bytes that UTF-16 adds to ASCII text).
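The encoded payload above can be reproduced on a Linux machine (assuming GNU coreutils `base64` and `iconv` are available) by converting the command text to UTF-16LE before base64-encoding it:

```shell
# Encode a PowerShell command the way -EncodedCommand expects:
# the UTF-16LE bytes of the command text, base64-encoded on one line.
printf 'write-output Encoded PowerShell script' \
  | iconv -f UTF-8 -t UTF-16LE \
  | base64 -w0
```

Piping the result back through `base64 -d | iconv -f UTF-16LE` recovers the original command.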
Encoded shell attacks on Linux
As with attacks on Windows systems, the same motivations exist for encoding commands on Linux systems: namely, to avoid encapsulation issues with special characters and to provide a basic means of hiding from host-based intrusion detection.
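A minimal sketch of the pattern on Linux (the command being hidden here is just an innocuous `echo`):

```shell
# An attacker encodes a command once...
encoded=$(printf 'echo Encoded bash script' | base64 -w0)

# ...and later decodes and executes it, so the plaintext command
# never appears in the script or the process ancestry on disk.
echo "$encoded" | base64 -d | sh
```

Running this prints `Encoded bash script`; detections for this behavior look for `base64 -d` (or `--decode`) piped into a shell interpreter.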
Today marks the beginning of enforcement of the EU General Data Protection Regulation (GDPR), and I’m pleased to announce that we have released an unmatched array of new features and resources to help support compliance with the GDPR and the policy needs of Azure customers.
New offerings include the general availability of the Azure GDPR Data Subject Request (DSR) portal, Azure Policy, Compliance Manager for GDPR, Data Log Export, and the Azure Security and Compliance Blueprint for GDPR.
In our webcast today, President Brad Smith outlined our commitment to making sure that our products and services comply with the GDPR, including having more than 1,600 engineers across the company working on GDPR projects. As Brad noted, we believe privacy is a fundamental human right, and that individuals must be in control of their data. So I am pleased that Azure is part of keeping that commitment by being the only hyperscale cloud provider to offer the streamlined mechanisms and tools for GDPR compliance that we are announcing today.
Azure Data Subject Request (DSR) portal enables you to fulfill GDPR requests. The DSR capability is generally available today through the Azure portal user interface, as well as through pre-existing
Azure has a thriving ecosystem where partners can publish many different offer types including virtual machine images, SaaS, solution templates, and managed applications, allowing customers to solve their unique problems in the way which best fits their scenario.
At Build, we announced a new category of offer in Azure Marketplace – container images
Azure customers can now discover and acquire secure and certified container images in Azure Marketplace to build container-based solutions. All images available in Azure Marketplace are certified and validated against container runtimes in Azure, such as the managed Azure Kubernetes Service (AKS), allowing customers to build with confidence and deploy with flexibility.
For ISVs who offer their applications as container images, this new offer type provides the opportunity to publish their solutions and reach Azure users who are building container-based architectures. It helps ISVs publish with confidence through the Azure Certified program, with validation across industry-standard container formats and the different Azure container services.
Get started with Azure Marketplace container images
After the recent general availability of Storage Explorer, we also added new features in the latest 1.1 release to align with the Azure Storage platform:
- Azurite, a cross-platform storage emulator
- Access tiers, which let you consume resources efficiently based on how frequently a blob is accessed
- Removal of the SAS URL start time, to avoid datacenter clock synchronization issues
Storage Explorer is a great tool for managing the contents of your Azure storage account. You can upload, download, and manage blobs, files, queues, and Cosmos DB entities. You also gain easy access to manage your virtual machine disks, work with either Azure Resource Manager or classic storage accounts, and manage and configure cross-origin resource sharing (CORS) rules. Storage Explorer works on public Azure, the Azure sovereign clouds, and Azure Stack.
Let’s go through some example scenarios where Storage Explorer helps with your daily job.
Sign in to your Azure cloud from Storage Explorer
To get started using Storage Explorer, sign in to your Azure account and stay connected to your subscriptions. If you have an account for Azure, an Azure sovereign cloud, or Azure Stack, you can easily sign in to it from Storage Explorer's "Add an Account" dialog.
In addition, now Storage Explorer shares the
Azure Cloud Shell provides browser-based, authenticated shell access to Azure from virtually anywhere. Cloud Shell gives users a rich environment with common tools that is updated and maintained by Microsoft. It comes in two flavors:
- Bash in Cloud Shell, which runs on Ubuntu Linux and became generally available in November 2017
- PowerShell in Cloud Shell, which runs Windows PowerShell 5.1 on Windows Server Core and has been in preview since September 2017
In this post, we are listing the key upcoming changes to the PowerShell experience in Azure Cloud Shell, namely:
- Faster startup time
- PowerShell Core 6 as the default experience
- Running on a Linux container
- Persistent tool settings

Faster startup time
We are well aware that the startup time of PowerShell in Azure Cloud Shell falls well short of users' expectations. For the past couple of months, the team has been working hard to make significant improvements in this area. We expect to deliver multi-fold improvements in startup time for the PowerShell experience (and also to make the Bash experience faster).
Last week, we released an update to the Azure IoT Reference Architecture Guide. Our focus for the update was to bring the document forward to the latest recommended Azure IoT cloud-native architecture and the latest technology implementation recommendations. The updated guide includes an overview of the IoT space, recommended subsystem factoring for solutions, and prescriptive technology recommendations per subsystem. Technical content added includes coverage of topics such as microservices, containers, orchestrators (e.g., Kubernetes and Service Fabric), serverless usage, Azure Stream Analytics, and edge devices. Major updates were made to the Stream Processing and Storage subsystem sections of the document, covering rules processing and storage technology options on Azure across differing types of IoT solutions.
The IoT Architecture Guide aims to accelerate customers building IoT solutions on Azure by providing a proven, production-ready architecture with proven technology implementation choices and links to Solution Accelerator reference architecture implementations such as Remote Monitoring and Connected Factory. The document offers an overview of the IoT space, recommended subsystem factoring for scalable IoT solutions, prescriptive technology recommendations per subsystem, and detailed sections per subsystem that explore use cases and technology alternatives.
Future updates – please provide feedback and ask questions
We continue to expand the Azure Marketplace ecosystem. From April 16 to 30, 15 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
(Basic) Apache NiFi 1.6 on Centos 7.4: A CentOS 7.4 VM running a basic install of Apache NiFi 1.6 using default configurations. Once the virtual machine is deployed and running, Apache NiFi can be accessed by opening a web browser and entering: http://<IP>:8080/nifi in the address bar.
Debian Web Server and mariadb: A ready-to-deploy Debian web server with MariaDB databases. A web server includes several parts that control how web users access hosted files. MariaDB is a fork of the MySQL relational database management system.
Jamcracker CSB Service Provider Version5: This service provider appliance is a cloud brokerage solution for SaaS and IaaS products. It automates order management, provisioning, and billing, and integrates to support ITSM, billing, ERP, and identity systems including Microsoft Active Directory.
MCubo Energy: MCubo Energy is a powerful platform that uses its own “best practices” to maximize your energy savings while safeguarding the environment. The proactive