Category Archives: Cloud Strategy

18 Jun

First Microsoft cloud regions in Middle East now available

This blog post was co-authored by Paul Lorimer, Distinguished Engineer, Office 365.

Azure and Office 365 generally available today, Dynamics 365 and Power Platform available by end of 2019

Today, Microsoft Azure and Microsoft Office 365 are taking a major step together to help support the digital transformation of our customers. Both Azure and Office 365 are now generally available from our first cloud datacenter regions in the Middle East, located in the United Arab Emirates (UAE). Dynamics 365 and Power Platform, offering the next generation of intelligent business applications and tools, are anticipated to be available from the cloud regions in UAE by the end of 2019.

The opening of the new cloud regions in Abu Dhabi and Dubai marks the first time Microsoft will deliver cloud services directly from datacenter locations in UAE and expands upon Microsoft’s existing investments in the Gulf and the wider Middle East region. By delivering the complete Microsoft cloud – Azure, Office 365, and Dynamics 365 – from datacenters in a given geography, we offer scalable, highly available, and resilient cloud services for organizations while helping them meet their data residency, security, and compliance needs.

Our new cloud regions adhere to Microsoft’s


13 Jun

Customers get unmatched security with Windows Server and SQL Server workloads in Azure

Customers such as Allscripts, Chevron, J.B. Hunt, and thousands of others are migrating their important workloads to Azure, where they find unmatched security. While cloud security is an initial concern for many, after digging in, customers often tell us that the security posture they can set up within Azure is easier to implement and far more comprehensive than what they could achieve in other environments.

Azure delivers multiple layers of security, from the secure foundation in our physical datacenters, to our operational practices, to engineering processes that follow industry-standard MITRE guidelines. On top of that, customers can choose from a variety of self-service security services that work for both Azure and on-premises workloads. We employ more than 3,500 cybersecurity professionals and spend $1 billion annually on security to help protect, detect, and respond to threats – delivering security operations that work 24x7x365 for our customers.

Let’s look at some examples of how Azure delivers unmatched security for your Windows Server and SQL Server workloads.

The broadest built-in protections across hybrid environments with Azure Security Center

Customers can get the broadest built-in protection available across both cloud and on-premises through Azure Security Center. This includes security recommendations for virtual


04 Jun

Announcing self-serve experience for Azure Event Hubs Clusters

For businesses today, data is indispensable. Innovative ideas in manufacturing, health care, transportation, and financial industries are often the result of capturing and correlating data from multiple sources. Now more than ever, the ability to reliably ingest and respond to large volumes of data in real time is the key to gaining competitive advantage for consumer and commercial businesses alike. To meet these big data challenges, Azure Event Hubs offers a fully managed and massively scalable distributed streaming platform designed for a plethora of use cases from telemetry processing to fraud detection.

Event Hubs has been immensely popular with Azure’s largest customers, and now even more so with the recent release of Event Hubs for Apache Kafka. With this powerful new capability, customers can stream events from Kafka applications seamlessly into Event Hubs without having to run Zookeeper or manage Kafka clusters, all while benefitting from a fully managed platform-as-a-service (PaaS) with features like auto-inflate and geo-disaster recovery. As the front door to Azure’s data pipeline, customers can also use the Capture feature to automatically persist streaming events into Azure Storage or Azure Data Lake, or natively perform real-time analysis on data streams using Azure Stream Analytics.
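To illustrate, a Kafka client can point at an Event Hubs namespace simply by changing its connection settings: the namespace exposes a Kafka endpoint on port 9093 and accepts SASL PLAIN authentication with the literal username `$ConnectionString`. The sketch below builds those settings for the `kafka-python` client; the namespace name and connection string are placeholders you would replace with your own.

```python
def event_hubs_kafka_config(namespace: str, connection_string: str) -> dict:
    """Kafka client settings for an Azure Event Hubs namespace.

    Event Hubs exposes a Kafka-compatible endpoint on port 9093 and
    authenticates via SASL PLAIN, where the username is the literal
    string "$ConnectionString" and the password is the namespace's
    connection string.
    """
    return {
        "bootstrap_servers": f"{namespace}.servicebus.windows.net:9093",
        "security_protocol": "SASL_SSL",
        "sasl_mechanism": "PLAIN",
        "sasl_plain_username": "$ConnectionString",
        "sasl_plain_password": connection_string,
    }

# Usage with the kafka-python package (requires a real namespace):
# from kafka import KafkaProducer
# producer = KafkaProducer(**event_hubs_kafka_config("my-namespace", "Endpoint=sb://..."))
# producer.send("my-event-hub", b"hello from an unmodified Kafka client")
```

No Kafka brokers or Zookeeper nodes are involved; the existing Kafka application talks directly to the managed Event Hubs endpoint.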

For customers with the most demanding streaming


03 Jun

Customize your automatic update settings for Azure Virtual Machine disaster recovery

In today’s cloud-driven world, employees are given access only to the data that is absolutely necessary for them to perform their jobs effectively. This limited access is especially important in scenarios where it’s difficult to monitor access behaviors, such as when you have many employees or engage vendors. Access is usually based on job responsibility, authority, and capability. As a result, some job profiles will not have access to certain data, or rights to perform specific actions, if they do not need them to fulfill their responsibilities. The ability to control access in this way while still allowing infrastructure administrators to perform their job duties is becoming more relevant and is frequently requested by customers.

You asked, we listened!

When we released the automatic update of agents used in disaster recovery (DR) of Azure Virtual Machines (VMs), the most frequent feedback we received was related to access control. Customers had DR admins who were given just enough rights to execute operations to enable, failover, or test DR. While they wanted to enable automatic updates and avoid the hassle of having to monitor for monthly updates and manually upgrade the agents, they didn’t want to give the DR admin contributor access


03 Jun

Azure Stack IaaS – part nine

This blog was co-authored by Aparna Vishwanathan, Senior Program Manager, Azure Stack and Tiberiu Radu, Senior Program Manager, Azure Stack.

Build on the success of others

Before we built Azure Stack, our program manager team called a lot of customers who were struggling to create a private cloud out of their virtualization infrastructure. I was surprised to learn that the few that managed to overcome the technical and political challenges of getting one set up had trouble getting their business units and developers to use it. It turns out they created what we now call a snowflake cloud, a cloud unique to just their organization. But their developers wanted the same functionality they get in the public cloud: an ecosystem full of rich documentation, examples, templates, forums, demos, and more.

This is one of the main problems we set out to solve with Azure Stack: a local cloud that not only automates deployment and operations, but is also consistent with Azure, so that developers and business units can tap into the ecosystem. In this blog post I will cover the different ways you can tap into the Azure ecosystem to get the most value out of


13 May

Azure SQL Database Edge: Enabling intelligent data at the edge

The world of data changes at a rapid pace, with more and more data projected to be stored and processed at the edge. Microsoft has enabled enterprises to adopt a common programming surface area in their datacenters with Microsoft SQL Server and in the cloud with Azure SQL Database. Latency, data governance, and network connectivity requirements continue to pull data and compute toward the edge. New sensors and chip innovations with analytical capabilities at a lower cost enable more edge compute scenarios, driving higher business agility.

At Microsoft Build 2019, we announced Azure SQL Database Edge, available in preview, to help address the requirements of data and analytics at the edge using the performant, highly available and secure SQL engine. Developers will now be able to adopt a consistent programming surface area to develop on a SQL database and run the same code on-premises, in the cloud, or at the edge.

Azure SQL Database Edge offers:

Small footprint – allows the database engine to run on ARM and x64 devices via the use of containers on interactive devices, edge gateways, and edge servers. Develop once and deploy anywhere – scenarios through a common
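The “consistent programming surface area” mentioned above means the same client code can target SQL Server on-premises, Azure SQL Database in the cloud, or SQL Database Edge in a container: only the server address in the connection string changes. A minimal sketch, with placeholder server names and credentials:

```python
def tds_connection_string(server: str, database: str, user: str, password: str) -> str:
    """Build an ODBC connection string for any SQL engine endpoint.

    The same format works whether the target is SQL Server on-premises,
    Azure SQL Database, or SQL Database Edge running in a container --
    only the server address differs.
    """
    return (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server},1433;DATABASE={database};UID={user};PWD={password}"
    )

# The same code targets any of the three deployment locations
# (server names below are hypothetical placeholders):
edge = tds_connection_string("edge-gateway.local", "telemetry", "sa", "<password>")
cloud = tds_connection_string("myserver.database.windows.net", "telemetry", "sa", "<password>")

# With the pyodbc package installed and a reachable endpoint:
# import pyodbc
# conn = pyodbc.connect(edge)
```

The point of the sketch is the develop-once-deploy-anywhere model: application code written against one endpoint runs unchanged against the others.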


08 May

Connecting the colossal: How to scale innovation with serverless integration

Starting the process of migrating to the cloud can be daunting. Legacy systems that are colossal in scale often overwhelm the average team tasked with the mission of digital transformation. How can they possibly untangle years of legacy code to start this new digital transformation initiative? Not only are these systems colossal in scale, they are also colossal in terms of business importance. Enterprise applications from vendors like SAP and IBM are integral to the daily rhythm of business. A seemingly simple mistake can result in catastrophic consequences.

Over the past year, the Azure Integration Services team has been working on solutions to these challenges, and we’re excited to announce new capabilities:

Developer focused – Improved the developer experience in Logic Apps by allowing you to write code directly as a step inside a Logic App.

Enterprise ready – Added new migration and modernization scenarios with the general availability of our new-and-improved SAP connector.

Serverless first – Better integration between API Management and Azure Functions makes it even easier to create and manage serverless integrations and applications.

The challenges facing customers

Over the past year, we’ve had the opportunity to meet with and hear from customers in-person to discuss the biggest challenges facing


01 May

Best practices in migrating SAP applications to Azure – part 3

This is the third blog in a three-part blog post series on best practices in migrating SAP to Azure.

BWoH and BW/4HANA on Azure

For many SAP customers, the compelling events driving their migration of SAP HANA to the cloud have come down to two factors:

End-of-life first-generation SAP HANA appliances, causing customers to re-evaluate their platform.

The desire to take advantage of the early value proposition of SAP Business Warehouse (BW) on HANA in a flexible TDI model over traditional databases, and later BW/4HANA.

As a result, numerous initial migrations of SAP HANA to Microsoft Azure have focused on SAP BW to take advantage of SAP HANA’s in-memory capability for the BW workload. This means migration of the BW application to utilize SAP HANA at the database layer, and eventually the more involved migration of BW on HANA to BW/4HANA.

The SAP Database Migration Option (DMO) with the System Move option of SUM, used as part of the migration, allows customers to perform the migration in a single step, from a source system on-premises to a target system residing in Microsoft Azure, minimizing overall downtime.

As with the S/4HANA example my colleague Marshal used in “Best practices in


01 May

Azure Stack IaaS – part seven

https://azure.microsoft.com/blog/azure-stack-iaas-part-seven-2/


25 Apr

Best practices in migrating SAP applications to Azure – part 2

This is the second blog in a three-part blog post series on best practices in migrating SAP to Azure.

Journey to SAP S/4HANA from SAP Business Suite

A common scenario where SAP customers can experience the speed and agility of the Azure platform is migrating from an SAP Business Suite running on-premises to SAP S/4HANA in the cloud. This is a two-step process: first, migrating from enterprise resource planning (ERP) on-premises to Suite on HANA in Azure, and then converting from Suite on HANA to S/4HANA.

Using the cloud as the destination for such a migration project has the potential to save organizations millions of dollars in upfront infrastructure cost, and can shave roughly 12 to 16 weeks off the project schedule, as the most complex components of the infrastructure are already in place and can easily be provisioned when required. You can see where these time savings come from by looking at the time it takes to go through the request for proposal (RFP) process, to procure expensive servers with large memory capacity, or potentially to procure dedicated appliances with only a five-year lifespan, such as storage
