It’s no secret that organizations consider data one of their most valuable assets and are investing to build their capability for data-driven decision making. Managing a flexible and cost-effective data estate on-premises is challenging, and we are seeing customers rapidly embrace Azure for its best-in-class analytics solutions. However, migrating analytics workloads can be complex and challenging. The thought of moving large volumes of critical business data has often been too daunting or too expensive for many enterprises. Add to that the challenges inherent in updating all the existing data ingestion and consumption pipelines from a traditional Hadoop environment, and it’s no surprise that many organizations have been reluctant to begin their cloud migration.
We are delighted to partner with WANdisco to provide a turnkey solution for Hadoop-oriented data lake migrations: WANdisco LiveData Platform for Azure, our preferred solution for Hadoop to Azure migrations. With LiveData Platform for Azure, you can deploy and manage your data lake migrations using the same Azure management experience you enjoy today through the Azure portal and Azure CLI. You can begin your data lake migration in minutes, not weeks or months, like you would with
Business disruptions, tactical pivots, and remote work have all emphasized the critical role that analytics plays in all organizations. Uncharted situations demand proven performance insights so that businesses can quickly determine what is and is not working. In recent months, the urgency for business-guiding insights has only been heightened, leading to a need for real-time analytics solutions. Equally important is the need to discover and share these insights in the most cost-effective manner.
Not only has COVID-19 challenged world health, but it has also created new economic challenges for businesses worldwide. These challenges have resulted in an increased need for tools that quickly deliver insights to business leaders—empowering informed decisions. This is where Microsoft Azure Synapse Analytics can help.
New circumstances demand new solutions
Azure Synapse Analytics is a new type of analytics platform that enables you to accelerate your time-to-insight with a unified experience and—just as important—save on costs while doing so. It is up to 14 times faster and costs 94 percent less than other cloud providers. Let’s dive into how Azure Synapse can help you respond, adapt, and save.
Respond to disruption and adapt to a new normal
History shows that proven analytics technologies,
This post was co-authored by Jie Feng, Principal Program Manager, and Sumi Venkitaraman, Senior Product Manager, Microsoft Azure.
Microsoft Azure Data Share is an open, easy, and secure way to share data at scale by enabling organizations to share data in-place or as a data snapshot. Microsoft Azure Data Explorer is a fast and highly scalable data analytics service for telemetry, time-series, and log data.
Fueled by digital transformation, modern organizations increasingly want to enable fluid data sharing to drive business decisions. Seamlessly sharing data for inter-departmental and inter-organizational collaboration can unlock tremendous competitive advantage. Maintaining control and visibility, however, remains an elusive goal. Even today, data is shared using the File Transfer Protocol (FTP), application programming interfaces (APIs), USB devices, and email attachments. These methods are simply not secure, cannot be governed, and are inefficient at best.
Azure Data Share in-place Sharing for Azure Data Explorer, now generally available, enables you to share big data easily and securely between internal departments and with external partners, vendors, or customers for near real-time collaboration.
Once data providers share data, recipients (data consumers) always have the latest data without needing any additional intervention. Additionally, data providers maintain control over the sharing and
More than ever before, companies are relying on their big data and artificial intelligence (AI) systems to find new ways to reduce costs and accelerate decision-making. However, customers using on-premises systems struggle to realize these benefits due to administrative complexity, an inability to scale their fixed infrastructure cost-effectively, and the lack of a shared collaborative environment for data engineers, data scientists, and developers.
To make it easier for customers to modernize their on-premises Spark and big data workloads to the cloud, we’re announcing a new migration offer with Azure Databricks. The offer includes:
- Up to a 52 percent discount over pay-as-you-go pricing when using the Azure Databricks Unit pre-purchase plans. This means that customers can free themselves from the complexities and constraints of their on-premises solutions and realize the benefits of the fully managed Azure Databricks service at a significant discount.
- A free migration assessment for qualified customers.
Azure Databricks is a fast, easy, and collaborative Apache Spark-based service that simplifies building big data and AI solutions. Since its debut two years ago, Azure Databricks has experienced significant adoption from customers, such as Shell, Cerner, Advocate Aurora Health, and Bosch, which are using it to run mission-critical big data and AI workloads.
This post is part 2 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs, and the difference it’s making to them. In part 1, we explored the challenges that led the Microsoft Office Licensing Service team to move from Azure Table storage to Azure Cosmos DB, and how it migrated its production workload to the new service. In part 2, we examine the outcomes resulting from the team’s efforts.
Strong benefits with minimal effort
The Microsoft Office Licensing Service (OLS) team’s migration from Azure Table storage to Azure Cosmos DB was simple and straightforward, enabling the team to meet all its needs with minimal effort.
An easy migration
Thanks to the Azure Cosmos DB Table API, the OLS team was able to reuse most of its data access code in the move, and the migration engine it wrote to avoid any downtime was fast and easy to build.
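To give a sense of what such a migration engine involves, here is a minimal sketch of the common backfill-plus-dual-write pattern for moving a table without downtime. The dict-based tables below are stand-ins for the real Azure Table storage and Cosmos DB Table API clients, and every name here is hypothetical illustration, not the OLS team’s actual code.

```python
class InMemoryTable:
    """Stand-in for a table client (same minimal read/write surface)."""
    def __init__(self):
        self._rows = {}

    def upsert(self, key, entity):
        self._rows[key] = dict(entity)

    def get(self, key):
        return self._rows.get(key)

    def scan(self):
        return self._rows.items()


def migrate(source, target):
    """Backfill: copy every existing entity from source to target."""
    copied = 0
    for key, entity in source.scan():
        target.upsert(key, entity)
        copied += 1
    return copied


def dual_write(source, target, key, entity):
    """During cutover, write to both stores so neither falls behind."""
    source.upsert(key, entity)
    target.upsert(key, entity)


# Usage: backfill existing records, then dual-write new activations
# until reads are switched over to the new store.
old = InMemoryTable()
new = InMemoryTable()
old.upsert("machine-1", {"product": "office", "activations": 3})
old.upsert("machine-2", {"product": "office", "activations": 1})

migrate(old, new)
dual_write(old, new, "machine-3", {"product": "office", "activations": 1})
```

Because the Table API keeps the same entity model as Azure Table storage, the application-facing read and write paths can stay largely unchanged while only the connection target moves.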
Danny Cheng, a software engineer at Microsoft who leads the OLS development team, explains:
“The migration engine was the only real ‘new code’ we had to write. And the code samples for all three parts are publicly available, so it’s not like we had to
This post is part 1 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs, and the difference it’s making to them. In part 1, we explore the challenges that led the Microsoft Office Licensing Service team to move from Azure Table storage to Azure Cosmos DB, and how it migrated its production workload to the new service. In part 2, we examine the outcomes resulting from the team’s efforts.
The challenge: Limited throughput and other capabilities
At Microsoft, the Office Licensing Service (OLS) supports activation of the Microsoft Office client on millions of devices around the world—including Windows, Mac, tablets, and mobile. It stores information such as machine ID, product ID, activation count, expiration date, and more. OLS is accessed by the Office client more than 240 million times per day by users around the world, with the first call coming from the client upon license activation and then every 2-3 days thereafter as the client checks to make sure the license is still valid.
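To put that daily figure in perspective, a quick back-of-envelope conversion (our arithmetic, not a number from the OLS team) shows the sustained request rate the backend must handle on average, before accounting for peaks:

```python
# Rough average request rate implied by 240 million calls per day.
calls_per_day = 240_000_000
seconds_per_day = 86_400
avg_per_second = calls_per_day / seconds_per_day
print(round(avg_per_second))  # roughly 2778 requests per second on average
```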
Until recently, OLS relied on Azure Table storage for its backend data store, which contained about 5 TB of data spread across 18 tables—with separate tables
This post is part 1 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs and the difference it’s making to them. In part 1, we explore the challenges that led service developers for Minecraft Earth to choose Azure Cosmos DB and how they’re using it to capture almost every action taken by every player around the globe—with ultra-low latency. In part 2, we examine the solution’s workload and how Minecraft Earth service developers have benefited from building it on Azure Cosmos DB.
Extending the world of Minecraft into our real world
You’ve probably heard of the game Minecraft, even if you haven’t played it yourself. It’s the best-selling video game of all time, having sold more than 176 million copies since 2011. Today, Minecraft has more than 112 million monthly players, who can discover and collect raw materials, craft tools, and build structures or earthworks in the game’s immersive, procedurally generated 3D world. Depending on game mode, players can also fight computer-controlled foes and cooperate with—or compete against—other players.
In May 2019, Microsoft announced the upcoming release of Minecraft Earth, which began its worldwide rollout in December 2019. Unlike preceding games in the
This post is part 2 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs and the difference it’s making to them. In part 1, we explored the challenges that led service developers for Minecraft Earth to choose Azure Cosmos DB and how they’re using it to capture almost every action taken by every player around the globe—with ultra-low latency. In part 2, we examine the solution’s workload and how Minecraft Earth service developers have benefited from building it on Azure Cosmos DB.
Geographic distribution and multi-region writes
Minecraft Earth service developers used the turnkey geographic distribution feature in Azure Cosmos DB to achieve three goals: fault tolerance, disaster recovery, and minimal latency—the latter achieved by also using the multi-master capabilities of Azure Cosmos DB to enable multi-region writes. Each supported geography has at least two service instances. For example, in North America, the Minecraft Earth service runs in the West US and East US Azure regions, with other components of Azure used to determine which is closer to the user and route traffic accordingly.
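The routing step described above can be illustrated with a minimal sketch. In production this is handled by Azure traffic-routing services rather than application code, and the function name and latency figures below are invented purely for illustration:

```python
# Minimal sketch of latency-based region routing: send each request to
# the service instance with the lowest measured latency. Hypothetical
# example only; real deployments rely on Azure routing services.

def pick_region(latencies_ms):
    """Return the region with the lowest probed latency."""
    return min(latencies_ms, key=latencies_ms.get)

# Made-up probe results for a player on the US west coast:
probes = {"West US": 18, "East US": 74}
region = pick_region(probes)
print(region)  # "West US" for these made-up probe values
```

With multi-region writes enabled, the chosen region can accept writes directly instead of forwarding them to a single primary, which is what keeps write latency low for players everywhere.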
Nathan Sosnovske, a Senior Software Engineer on the Minecraft Earth services development team, explains:
“With Azure available
Processing big data in real time is an operational necessity for many businesses. Azure Stream Analytics is Microsoft’s serverless real-time analytics offering for complex event processing.
We are excited and humbled to announce that Microsoft has been named a leader in The Forrester Wave™: Streaming Analytics, Q3 2019. Microsoft believes this report truly reflects the market momentum of Azure Stream Analytics, satisfied customers, a growing partner ecosystem and the overall strength of our Azure cloud platform. You can access the full report here.
The Forrester Wave™: Streaming Analytics, Q3 2019
The Forrester Wave™: Streaming Analytics, Q3 2019 report evaluated streaming analytics offerings from 11 solution providers, and we are honored to share that Forrester has recognized Microsoft as a Leader in this category. Azure Stream Analytics received the highest possible score in 12 criteria, including Ability to execute, Administration, Deployment, Solution Roadmap, Customer adoption, and many more.
The report states, “Microsoft Azure Stream Analytics has strengths in scalability, high availability, deployment, and applications. Azure Stream Analytics is an easy on-ramp for developers who already know SQL. Zero-code integration with over 15 other Azure services makes it easy to try and therefore adopt, making the product the
Today, businesses are forced to maintain two types of analytical systems: data warehouses and data lakes. Data warehouses provide critical insights on business health. Data lakes can uncover important signals on customers, products, employees, and processes. Both are critical, yet they operate independently of one another, which can lead to uninformed decisions. At the same time, businesses need to unlock insights from all their data to stay competitive and fuel innovation with purpose. Can a single cloud analytics service bridge this gap and enable the agility that businesses demand?
Azure Synapse Analytics
Today, we are announcing Azure Synapse Analytics, a limitless analytics service that brings together enterprise data warehousing and big data analytics. It gives you the freedom to query data on your terms, using either serverless on-demand or provisioned resources, at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate business intelligence and machine learning needs.
Simply put, Azure Synapse is the next evolution of Azure SQL Data Warehouse. We have taken the same industry-leading data warehouse to a whole new level of performance and capabilities. In fact, it’s the first and only analytics system to have