Whether you’re a new student, a thriving startup, or the largest enterprise, you have financial constraints, and you need to know what you’re spending, where, and how to plan for the future. Nobody wants a surprise when it comes to the bill, and this is where Azure Cost Management and Billing comes in.
We’re always looking for ways to learn more about your challenges and how Azure Cost Management and Billing can help you better understand where you’re accruing costs in the cloud, identify and prevent bad spending patterns, and optimize costs to empower you to do more with less. Here are a few of the latest improvements and updates based on your feedback:
- Prevent exceeding your budget with forecasted cost alerts
- New cost view for subscriptions
- What’s new in Cost Management Labs
- Demystifying cloud economics
- New ways to save money with Azure
- New videos and learning opportunities
- Documentation updates
Let’s dig into the details.
Prevent exceeding your budget with forecasted cost alerts
Staying on top of your costs is critical. Luckily, you can easily set up predefined alerts as you approach your budget in Azure Cost Management and Billing. Now you can define alerts based
Customers with qualifying subscription types* can now try the new serverless SQL pool and Apache Spark pool resources in Azure Synapse Analytics using free quantities until July 31, 2021. Sign in to your Azure Synapse Analytics account to get started.
Serverless SQL pool
Customers can enjoy up to 10TB of free queries per month using the serverless SQL pool resource until July 31, 2021.
The new flexible service model for query processing in Azure Synapse Analytics allows data teams to use both serverless and dedicated options. Organizations can now choose the most cost-effective option for each use case—enjoying the advantages of a data lake for quick data exploration with pay-per-query pricing, or a dedicated data warehouse for more predictable and mission-critical workloads.
A serverless SQL pool requires no infrastructure to set up and no clusters to maintain. A default endpoint for this service is provided within every Azure Synapse Analytics workspace, so customers can start querying data as soon as the workspace is created.
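To make the pay-per-query model concrete, here is a minimal sketch of the kind of T-SQL a serverless SQL pool runs: an OPENROWSET query that reads files straight out of a data lake. The storage URL below is a placeholder, not a real account, and the helper function is illustrative rather than part of any Azure SDK; in practice the generated query would be submitted to the workspace's default serverless endpoint through any SQL client.

```python
def build_openrowset_query(file_url: str, file_format: str = "PARQUET", top: int = 10) -> str:
    """Build a T-SQL query for a Synapse serverless SQL pool that reads
    files directly from a data lake via OPENROWSET (billed per query)."""
    return (
        f"SELECT TOP {top} * "
        f"FROM OPENROWSET(BULK '{file_url}', FORMAT = '{file_format}') AS rows"
    )

# Placeholder ADLS Gen2 path; point this at your own storage account.
query = build_openrowset_query(
    "https://contosolake.dfs.core.windows.net/data/sales/*.parquet"
)
print(query)
```

Because there is no cluster to provision, a query like this can run moments after the workspace is created, and you pay only for the data it scans.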
Apache Spark pool
Customers can enjoy up to 120 free vCore-hours per month
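To illustrate how the 120 free vCore-hours offset Spark pool usage, here is a small back-of-envelope sketch. The pool size and running time are hypothetical examples, not figures from the offer itself; only the 120 vCore-hour free quantity comes from the announcement.

```python
FREE_VCORE_HOURS = 120  # free monthly quantity during the offer period

def billable_vcore_hours(vcores: int, hours: float) -> float:
    """Return the vCore-hours billed after applying the free monthly quantity."""
    used = vcores * hours
    return max(0.0, used - FREE_VCORE_HOURS)

# A hypothetical 8-vCore Spark pool running 20 hours uses 160 vCore-hours;
# the first 120 are free, so 40 are billable.
print(billable_vcore_hours(8, 20))  # 40.0
```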
It’s no secret that organizations consider data one of their most valuable assets and are investing to build their capability for data-driven decision making. Managing a flexible and cost-effective data estate on-premises is challenging, and we are seeing customers rapidly embrace Azure for its best-in-class analytics solutions. However, migrating analytics workloads can be complex and challenging. The thought of moving large volumes of critical business data has often been too daunting or too expensive for many enterprises. Add to that the challenges inherent in updating all the existing data ingestion and consumption pipelines from your traditional Hadoop environment, and it’s no surprise that many organizations have been reticent to begin their cloud migration.
We are delighted to partner with WANdisco to provide a turnkey solution for Hadoop-oriented data lake migrations, WANdisco LiveData Platform for Azure. WANdisco LiveData Platform for Azure is our preferred solution for Hadoop to Azure migrations. With LiveData Platform for Azure, you can deploy and manage your data lake migrations using the same Azure management experience you enjoy today through the Azure portal and Azure CLI. You can begin your data lake migration in minutes, not weeks or months as you would with
Business disruptions, tactical pivots, and remote work have all emphasized the critical role that analytics plays in all organizations. Uncharted situations demand proven performance insights so that businesses can quickly determine what is and is not working. In recent months, the urgency for business-guiding insights has only been heightened, leading to a need for real-time analytics solutions. Equally important is the need to discover and share these insights in the most cost-effective manner.
COVID-19 has not only challenged world health but also created new economic challenges for businesses worldwide. These challenges have resulted in an increased need for tools that quickly deliver insights to business leaders—empowering informed decisions. This is where Microsoft Azure Synapse Analytics can help.
New circumstances demand new solutions
Azure Synapse Analytics is a new type of analytics platform that enables you to accelerate your time-to-insight with a unified experience and—just as important—save on costs while doing so. It is up to 14 times faster and costs 94 percent less than other cloud providers. Let’s dive into how Azure Synapse can help you respond, adapt, and save.
Respond to disruption and adapt to a new normal
History shows that proven analytics technologies,
This post was co-authored by Jie Feng, Principal Program Manager, and Sumi Venkitaraman, Senior Product Manager, Microsoft Azure.
Microsoft Azure Data Share is an open, easy, and secure way to share data at scale by enabling organizations to share data in-place or as a data snapshot. Microsoft Azure Data Explorer is a fast and highly scalable data analytics service for telemetry, time-series, and log data.
Fueled by digital transformation, modern organizations increasingly want to enable fluid data sharing to drive business decisions. Seamlessly sharing data for inter-departmental and inter-organizational collaboration can unlock tremendous competitive advantage. Maintaining control and visibility, however, remains an elusive goal. Even today, data is shared using the File Transfer Protocol (FTP), application programming interfaces (APIs), USB devices, and email attachments. These methods are simply not secure, cannot be governed, and are inefficient at best.
Azure Data Share in-place Sharing for Azure Data Explorer, now generally available, enables you to share big data easily and securely between internal departments and with external partners, vendors, or customers for near real-time collaboration.
Once data providers share data, recipients (data consumers) always have the latest data without needing any additional intervention. Additionally, data providers maintain control over the sharing and
More than ever before, companies are relying on their big data and artificial intelligence (AI) systems to find new ways to reduce costs and accelerate decision-making. However, customers using on-premises systems struggle to realize these benefits due to administrative complexity, inability to scale their fixed infrastructure cost-effectively, and lack of a shared collaborative environment for data engineers, data scientists and developers.
To make it easier for customers to modernize their on-premises Spark and big data workloads to the cloud, we’re announcing a new migration offer with Azure Databricks. The offer includes:
- Up to a 52 percent discount over the pay-as-you-go pricing when using the Azure Databricks Unit pre-purchase plans. This means that customers can free themselves from the complexities and constraints of their on-premises solutions and realize the benefits of the fully managed Azure Databricks service at a significant discount.
- A free migration assessment for qualified customers.
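To show what the maximum discount means in practice, here is a small illustrative calculation. The DBU volume and the $0.40 unit price are hypothetical placeholder figures, not actual Azure Databricks list prices; only the 52 percent ceiling comes from the offer.

```python
def pre_purchase_cost(payg_cost: float, discount_pct: float = 52.0) -> float:
    """Cost under a pre-purchase plan, expressed as a discount off pay-as-you-go."""
    return payg_cost * (1 - discount_pct / 100)

# Hypothetical figures: 10,000 DBUs at $0.40 per DBU pay-as-you-go.
payg = 10_000 * 0.40                      # $4,000 pay-as-you-go
print(round(pre_purchase_cost(payg), 2))  # 1920.0, i.e. $1,920 at the 52% discount
```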
Azure Databricks is a fast, easy, and collaborative Apache Spark-based service that simplifies building big data and AI solutions. Since its debut two years ago, Azure Databricks has experienced significant adoption from customers, such as Shell, Cerner, Advocate Aurora Health, and Bosch, which are using it to run mission-critical big data and AI workloads.
This post is part 2 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs, and the difference it’s making to them. In part 1, we explored the challenges that led the Microsoft Office Licensing Service team to move from Azure Table storage to Azure Cosmos DB, and how it migrated its production workload to the new service. In part 2, we examine the outcomes resulting from the team’s efforts.
Strong benefits with minimal effort
The Microsoft Office Licensing Service (OLS) team’s migration from Azure Table storage to Azure Cosmos DB was simple and straightforward, enabling the team to meet all its needs with minimal effort.
An easy migration
In moving to Azure Cosmos DB, thanks to its Table API, the OLS team was able to reuse most of its data access code, and the migration engine they wrote to avoid any downtime was fast and easy to build.
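The code reuse the Table API enabled can be sketched as follows: when the data access layer talks to a generic table client, switching the backing store from Azure Table storage to Azure Cosmos DB is essentially a connection-string change. The record fields below are hypothetical examples rather than the real OLS schema, and the fake client stands in for a real SDK table client so the sketch runs locally.

```python
def upsert_license_record(table_client, machine_id: str, product_id: str,
                          activation_count: int) -> dict:
    """Write a license record through whichever backing store the client targets."""
    entity = {
        "PartitionKey": machine_id,
        "RowKey": product_id,
        "ActivationCount": activation_count,
    }
    table_client.upsert_entity(entity)
    return entity

class _FakeTableClient:
    """Local stand-in for a real table client; same upsert-style surface."""
    def __init__(self):
        self.entities = []

    def upsert_entity(self, entity):
        self.entities.append(entity)

client = _FakeTableClient()
record = upsert_license_record(client, "machine-001", "office-pro", 1)
print(record["RowKey"])  # office-pro
```

Because the function never names the backing store, the same data access code works before and after the migration, which is the property the OLS team relied on.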
Danny Cheng, a software engineer at Microsoft who leads the OLS development team, explains:
“The migration engine was the only real ‘new code’ we had to write. And the code samples for all three parts are publicly available, so it’s not like we had to
This post is part 1 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs, and the difference it’s making to them. In part 1, we explore the challenges that led the Microsoft Office Licensing Service team to move from Azure Table storage to Azure Cosmos DB, and how it migrated its production workload to the new service. In part 2, we examine the outcomes resulting from the team’s efforts.
The challenge: Limited throughput and other capabilities
At Microsoft, the Office Licensing Service (OLS) supports activation of the Microsoft Office client on millions of devices around the world—including Windows, Mac, tablets, and mobile. It stores information such as machine ID, product ID, activation count, expiration date, and more. OLS is accessed by the Office client more than 240 million times per day by users around the world, with the first call coming from the client upon license activation and then every 2-3 days thereafter as the client checks to make sure the license is still valid.
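A quick back-of-envelope calculation, using only the 240-million-calls-per-day figure from above, shows the sustained request rate the backend has to absorb:

```python
CALLS_PER_DAY = 240_000_000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

avg_calls_per_second = CALLS_PER_DAY / SECONDS_PER_DAY
print(round(avg_calls_per_second))  # 2778 requests/second, on average
```

Peak traffic would of course sit well above this daily average, which is why per-table throughput limits became a concern.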
Until recently, OLS relied on Azure Table storage for its backend data store, which contained about 5 TB of data spread across 18 tables—with separate tables
This post is part 1 of a two-part series about how organizations use Azure Cosmos DB to meet real world needs and the difference it’s making to them. In part 1, we explore the challenges that led service developers for Minecraft Earth to choose Azure Cosmos DB and how they’re using it to capture almost every action taken by every player around the globe—with ultra-low latency. In part 2, we examine the solution’s workload and how Minecraft Earth service developers have benefited from building it on Azure Cosmos DB.
Extending the world of Minecraft into our real world
You’ve probably heard of the game Minecraft, even if you haven’t played it yourself. It’s the best-selling video game of all time, having sold more than 176 million copies since 2011. Today, Minecraft has more than 112 million monthly players, who can discover and collect raw materials, craft tools, and build structures or earthworks in the game’s immersive, procedurally generated 3D world. Depending on game mode, players can also fight computer-controlled foes and cooperate with—or compete against—other players.
In May 2019, Microsoft announced the upcoming release of Minecraft Earth, which began its worldwide rollout in December 2019. Unlike preceding games in the
This post is part 2 of a two-part series about how organizations are using Azure Cosmos DB to meet real world needs and the difference it’s making to them. In part 1, we explored the challenges that led service developers for Minecraft Earth to choose Azure Cosmos DB and how they’re using it to capture almost every action taken by every player around the globe—with ultra-low latency. In part 2, we examine the solution’s workload and how Minecraft Earth service developers have benefited from building it on Azure Cosmos DB.
Geographic distribution and multi-region writes
Minecraft Earth service developers used the turnkey geographic distribution feature in Azure Cosmos DB to achieve three goals: fault tolerance, disaster recovery, and minimal latency—the latter achieved by also using the multi-master capabilities of Azure Cosmos DB to enable multi-region writes. Each supported geography has at least two service instances. For example, in North America, the Minecraft Earth service runs in the West US and East US Azure regions, with other components of Azure used to determine which is closer to the user and route traffic accordingly.
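The "route to the closer instance" idea described above can be sketched very simply: given per-user latency measurements to each regional service instance, pick the region with the lowest latency. The latency numbers below are hypothetical, and in the real deployment this routing is handled by other Azure components rather than application code; the sketch only illustrates the decision.

```python
def nearest_region(latencies_ms: dict) -> str:
    """Pick the region with the lowest measured latency for this user."""
    return min(latencies_ms, key=latencies_ms.get)

# Hypothetical per-user latency measurements to each North American instance.
measured = {"West US": 34.0, "East US": 81.0}
print(nearest_region(measured))  # West US
```

With multi-region writes enabled, whichever region is chosen can accept the write locally, which is what keeps latency minimal for players everywhere.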
Nathan Sosnovske, a Senior Software Engineer on the Minecraft Earth services development team, explains:
“With Azure available