Last September, at SNIA’s Storage Developer Conference, I presented a prototype of the Project Denali SSD. Project Denali drives provide the flexibility needed to optimize for the workloads of a wide variety of cloud applications, the simplicity to keep pace with rapid innovations in NAND flash memory and application design, and the scale required for the multitenant hardware that is so common in the cloud.
This month, I’m attending the Open Compute Project (OCP) U.S. Summit 2018 to begin formalizing the specification that will define the interface to Project Denali drives. Once in place, the specification will allow both hardware vendors and cloud providers to build and release their final products. The specification defines a new abstraction that separates the role of managing NAND from the role of managing data placement. The former will remain in the hardware, close to the NAND, in the product that reinvents itself with every new generation of NAND. The latter, once separated from the NAND management algorithms, can follow its own schedule for innovation and won’t be prone to bugs introduced by product cycles that track solely with NAND generations.
The specification, with the refactored algorithms it requires, will give a renewed
You have heard the hype around serverless computing, read countless online articles about its benefits, and heard industry pundits find new adjectives to describe its pathbreaking nature on a frequent basis. Are you now ready to dig into what you can actually do with real code using serverless computing cloud services like Azure Functions? If so, you should download and read the free Azure Serverless Computing Cookbook, which describes, with rich code examples, how to solve common problems using serverless Azure Functions.
However, if you need a little more motivation, read on.
Let us get the basics out of the way first. Serverless computing enables:
- Full abstraction of servers: Focus on your application code, not on the servers running it.
- Instant, event-driven scalability: Don’t worry about scaling up infrastructure as traffic grows.
- Pay-per-use: Pay only for the time your code runs and the resources it consumes.
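To make the “focus on your code, not on servers” point concrete, here is a minimal sketch of an HTTP-triggered handler in the spirit of Azure Functions. Note that this is a plain-Python stand-in kept self-contained for illustration; the real Azure Functions Python programming model uses the `azure.functions` package, and the payload shape below is an assumption, not the actual API.

```python
# A toy stand-in for an HTTP-triggered serverless function.
# (The real Azure Functions model uses azure.functions.HttpRequest /
# HttpResponse; plain dicts keep this sketch self-contained.)

def main(req: dict) -> dict:
    """Handle one HTTP event -- this logic is all you write; no server code."""
    name = req.get("params", {}).get("name", "world")
    return {"status": 200, "body": f"Hello, {name}!"}

# The platform invokes the handler once per event and bills only for
# the time and resources that invocation consumes.
print(main({"params": {"name": "Azure"}})["body"])  # -> Hello, Azure!
```

The design point: because the platform owns process lifetime and scaling, the unit you deploy is just the handler, and a burst of 10,000 events simply means 10,000 billed invocations rather than capacity you had to provision in advance.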
However, as a developer, there is much more to it that you should care about.
Your cloud journey can start small
You don’t really have to bet your entire application portfolio on this new way of building software all at once. The good thing about the Functions-as-a-Service (FaaS) model provided
Microsoft and 21Vianet have agreed to extend their partnership to provide world-class public cloud services to Chinese customers. Combining Microsoft’s global technology leadership with 21Vianet’s local operations expertise, Microsoft Azure and Office 365 operated by 21Vianet have achieved unprecedented, robust triple-digit growth for nearly four consecutive years in China. This breakthrough cooperation model, pioneered by Microsoft and 21Vianet, has been recognized as an effective and successful approach to the legal, compliant operation of international cloud services in China.
Announced in 2013 and officially launched in March 2014, the service made Microsoft the first international public cloud provider to bring its cloud technology to China in partnership with 21Vianet. Customers and partners range from established Chinese brands such as Haier, Lenovo, and Huawei to emerging powerhouses such as smartphone manufacturer Xiaomi, bike-share company Mobike, automobile manufacturer BYD, world-leading sporting goods company Amer Sports, and Arcplus Data & Innovation Technology, an integrated solutions provider for the construction engineering industry.
Office 365 operated by 21Vianet was launched in China in April 2014 and now ranks #1 in China’s SaaS market. Huawei, Tencent, and Pactera currently use Office 365 operated by 21Vianet to empower their employees and optimize their daily business operations.
Microsoft Azure and
This blog post is co-authored by Joachim Hafner, Cloud Solution Architect
We’re pleased to offer a free e-book for those of you who are new to Azure or in the beginning stages of planning a cloud migration: the Azure Strategy and Implementation Guide for IT Organizations. As cloud solution architects, we hear a lot of the same questions from customers who are starting to think about their cloud implementation process. This motivated us to write this e-book, which provides guidance in the form of a blueprint that customers can follow as they form their cloud strategy.
Whether you are starting with Azure or doing more general research regarding how IT teams navigate cloud implementation, this guide offers a balance of broadly applicable advice and Azure specifics for you to consider. Here is an overview of what is covered:
Chapter 1: Governance – This chapter covers the starting points, from the aspirational “digital transformation” to the important tactical steps of administration and resource naming conventions. It provides an overview of topics ranging from envisioning to cloud readiness, administration, and security standards and policy.

Chapter 2: Architecture – This section takes a longer look at security, touches on cloud design patterns, and
If you need to get up to speed on how to work best in a hybrid environment, we’ve collected five of our most-viewed webinars from the past year for your binge-watching pleasure. These on-demand sessions cover a variety of topics essential to anyone working on a mixed on-premises and cloud strategy:
1. Delivering innovation with the right hybrid cloud solution
If you don’t yet have a plan to take control of your mixed on-premises and cloud infrastructure, simplifying each user’s identity to a single credential can help you easily manage and secure services and create a data platform that simplifies security.
The on-demand Delivering innovation with the right hybrid cloud solution webinar covers how to:
- Consolidate identities and create a consistent data infrastructure.
- Unify development with help from Azure Stack application patterns.
- Select and use the best data platform, no matter your cloud model.

2. Migrating to a hybrid cloud environment: An insider’s walkthrough of 3 key methods
Your company has created a cloud strategy, or maybe you are playing catch-up to an employee-driven move to the cloud. Completing that move in an orderly way is a top priority.
If you’re in Warsaw, Poland March 12-15, you’ll want to come and join Microsoft at Supercomputing Frontiers Europe. This conference is a great opportunity to come together with high-performance computing leaders and practitioners. Microsoft will be there to talk about how Azure enables our customers to run true HPC workloads in the cloud.
Associate Professor Dariusz Mrożek from Silesian University of Technology in Gliwice will deliver a keynote titled “Accelerating scientific discoveries in protein bioinformatics with Microsoft Azure cloud platform”. In addition, we will be delivering a half-day of hands-on workshops on March 15. The workshop will include:
- Deploying a Linux RDMA InfiniBand cluster with the Azure CLI.
- Deploying an HPC cluster from a CycleCloud template.
- An Azure Batch AI training lab.
- An OpenFOAM simulation in an Azure Marketplace UberCloud container.
The workshop will demonstrate different ways of using Azure for running HPC jobs. Instructors will start with an introductory presentation of HPC services on Azure and the workshop environment, followed by a hands-on lab session in which they will go step-by-step with the participants through the lab content. Participants will also receive free Azure passes and will be able to execute the labs following the instructor guidance or at their own pace.
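As a taste of the first lab item, deploying HPC infrastructure with the Azure CLI looks roughly like the sketch below. The resource group name, location, and image are assumptions for illustration, and the commands are echoed as a dry run; remove the leading `echo` to execute them with a logged-in Azure CLI.

```shell
#!/bin/sh
# Hypothetical names -- substitute your own subscription's values.
RG=hpc-workshop-rg
LOCATION=westeurope

# Every Azure deployment starts with a resource group.
echo az group create --name "$RG" --location "$LOCATION"

# H16r sizes expose RDMA/InfiniBand, and the CentOS-HPC
# Marketplace image ships with the RDMA drivers preinstalled.
echo az vm create \
  --resource-group "$RG" \
  --name hpc-node-0 \
  --size Standard_H16r \
  --image OpenLogic:CentOS-HPC:7.4:latest \
  --generate-ssh-keys
```

In a real cluster deployment you would repeat the VM creation (or use a scale set or CycleCloud template, as the workshop covers) and place all nodes in the same availability set so the InfiniBand fabric is shared.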
If you’re coming to
We are thrilled to announce the availability of B-series (burstable) VMs in Azure Container Service (AKS).
Burstable VMs (B-series) are significantly cheaper than the standard recommended VMs such as Standard_DS2_V2. B-series VMs are particularly suited to development and test environments, where performance requirements are bursty rather than constant. In fact, the B-series provides the lowest cost for bursty CPU usage, and thus reduces development and test environment costs significantly. We hope this addition will significantly reduce the cost of learning Kubernetes on AKS, building proofs of concept on Azure Container Service (AKS), running dev/test workloads, and more.
The following configurations are available today.
| SKU  | Type     | vCPUs | RAM (GB) | Data Disks | Max IOPS | Local SSD |
|------|----------|-------|----------|------------|----------|-----------|
| B1s  | Standard | 1     | 1        | 2          | 800      | 2 GB      |
| B1ms | Standard | 1     | 2        | 2          | 1,600    | 4 GB      |
| B2s  | Standard | 2     | 4        | 4          | 3,200    | 8 GB      |
| B2ms | Standard | 2     | 8        | 4          | 4,800    | 16 GB     |
| B4ms | Standard | 4     | 16       | 8          | 7,200    | 32 GB     |
| B8ms | Standard | 8     | 32       | 16         | 10,800   | 64 GB     |
In comparison, a Standard_DS2_V2 node currently costs more than five times as much as the B1/B2 SKUs. Check the latest VM pricing.
To get started, log on to the Azure portal and search for Container Service (managed). As you follow the AKS create cluster workflow, you will
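A B-series AKS cluster can also be created from the command line. A minimal sketch follows, assuming placeholder resource-group and cluster names; the commands are echoed as a dry run, so remove the leading `echo` to execute them with a logged-in Azure CLI.

```shell
#!/bin/sh
# Placeholder names for illustration -- substitute your own.
RG=aks-dev-rg
CLUSTER=burstable-aks

echo az group create --name "$RG" --location westus2

# --node-vm-size selects the burstable B-series SKU for the agent nodes.
echo az aks create \
  --resource-group "$RG" \
  --name "$CLUSTER" \
  --node-vm-size Standard_B2s \
  --node-count 2 \
  --generate-ssh-keys

# Fetch kubeconfig credentials so kubectl can talk to the cluster.
echo az aks get-credentials --resource-group "$RG" --name "$CLUSTER"
```

For a dev/test cluster, two B2s nodes (2 vCPUs, 4 GB each) are typically enough to run a small workload while keeping costs to a fraction of a DS2_V2-based cluster.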
The Graphical Execution Plan feature within SQL Server Management Studio (SSMS) is now supported for SQL Data Warehouse (SQL DW)! With a click of a button, you can create a graphical representation of a distributed query plan for SQL DW.
Before this enhancement, query troubleshooting for SQL DW was often a tedious process that required you to run the EXPLAIN command. SQL DW customers can now seamlessly and visually debug query plans to identify performance bottlenecks directly within the SSMS window. This extends the query troubleshooting experience by displaying costly data movement operations, which are the most common reason for slow distributed query plans. Below is a simple example of troubleshooting a distributed query plan for SQL DW using the Graphical Execution Plan.
The view below displays the estimated execution plan for a query. As we can see, this is an incompatible join which occurs when there is a join between two tables distributed on different columns. An incompatible join will create a ShuffleMove operation, where temp tables will be created on every distribution to satisfy the join locally before streaming the results back to the user. The ShuffleMove has become a performance bottleneck for this query:
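One common way to remove a ShuffleMove like the one above is to recreate one side of the join so that both tables are hash-distributed on the join key. A sketch is below; the table and column names are illustrative, not from the example query:

```sql
-- Hypothetical fix: rebuild the fact table hash-distributed on the
-- join key so the join becomes distribution-compatible.
CREATE TABLE dbo.FactSales_ByCustomer
WITH
(
    DISTRIBUTION = HASH(CustomerKey),
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT * FROM dbo.FactSales;

-- Re-check the plan, either with EXPLAIN in text form
-- or the new Graphical Execution Plan in SSMS.
EXPLAIN
SELECT c.CustomerKey, SUM(f.SalesAmount) AS TotalSales
FROM dbo.FactSales_ByCustomer AS f
JOIN dbo.DimCustomer AS c
    ON f.CustomerKey = c.CustomerKey
GROUP BY c.CustomerKey;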
This is the first of a blog series which presents success stories from customers with Azure Backup. Here we discuss how Azure Backup helped Russell Reynolds
Russell Reynolds is a global leadership and executive search firm that helps its clients with assessment, executive search, and leadership transitions for boards of directors, chief executive officers, and other key roles within the C-suite. Having moved to Azure to reduce its IT and datacenter costs, the company began looking for an alternative to its tape backups, which were proving both cumbersome and expensive. Enter Azure Backup.
How Azure Backup helped
With Microsoft System Center 2012 R2 Data Protection Manager, they back up their VMware workloads locally and to the Azure cloud, where data can be retained for up to 99 years, eliminating their need for tapes. They used the Azure Backup Offline Seeding capability to copy their initial 10 TB of data to the cloud. Thereafter, Azure Backup transfers only incremental data during daily backups, reducing storage consumption and the need for large amounts of bandwidth.
“Even though we used very reputable partners for tape handling, it always made us nervous when our data left our facilities,” says David W. Pfister, Director of Global Distributed Infrastructure and Client
A few months ago, we announced that we were performing a compliance assessment of Microsoft Azure Stack. Today, we are happy to share that the compliance assessment is complete and available to you.
Knowing that preparing compliance paperwork is a tedious task, we precompiled the documentation for our customers. Since Azure Stack is delivered as an integrated system through hardware partners, we are in a unique position to perform a formal compliance assessment of Azure Stack that applies to all our customers. This resulted in a set of precompiled compliance documents that customers can now use to accelerate their compliance certification process.
We are glad to announce that Coalfire, a Qualified Security Assessor (QSA) and independent auditing firm, has audited and evaluated Azure Stack Infrastructure against the technical controls of PCI-DSS and the CSA Cloud Control Matrix, and found that Azure Stack satisfies the applicable controls.
In the assessor’s words:
“It is Coalfire’s opinion that Microsoft Azure Stack integrated system, reviewed between July 2017 and October 2017, can be effective in creating a PCI DSS compliant infrastructure and to assist in a comprehensive program of compliance with PCI DSS version 3.2.”
“It is Coalfire’s opinion that Microsoft Azure Stack as deployed