If data-access challenges have been keeping you from running high-performance computing (HPC) jobs in Azure, we’ve got great news: the now-available Microsoft Azure HPC Cache service lets you run your most demanding workloads in Azure without the time and cost of rewriting applications, while storing data where you want it, in Azure or on your on-premises storage. By minimizing latency between compute and storage, the HPC Cache service seamlessly delivers the high-speed data access required to run your HPC applications in Azure.
Use Azure to expand analytic capacity—without worrying about data access
Most HPC teams recognize the potential for cloud bursting to expand analytic capacity. While many organizations would benefit from the capacity and scale advantages of running compute jobs in the cloud, users have been held back by the size of their datasets and the complexity of providing access to those datasets, typically stored on long-deployed network-attached storage (NAS) assets. These NAS environments often hold petabytes of data collected over a long period of time and represent significant infrastructure investment.
Here’s where the HPC Cache service can help. Think of the service as an edge cache that provides low-latency access to POSIX file data sourced from one
https://azure.microsoft.com/blog/democratizing-agriculture-intelligence-introducing-azure-farmbeats/
For an industry that started 12,000 years ago, there is a lot of unpredictability and imprecision in agriculture. To be predictable and precise, we need to align our actions with insights gathered from data. Last week at Microsoft Ignite, we …
Today, more and more organizations are focused on delivering new digital solutions to customers and finding that the need for increased agility, improved processes, and collaboration between development and operation teams is becoming business-critical. For over a decade, DevOps has been the answer to these challenges. Understanding the need for DevOps is one thing, but the actual adoption of DevOps in the real world is a whole other challenge. How can an organization with multiple teams and projects, with deeply rooted existing processes, and with considerable legacy software change its ways and embrace DevOps?
At Microsoft, we know something about these challenges. As a company that has been building software for decades, Microsoft consists of thousands of engineers around the world who deliver many different products. From Office to Azure to Xbox, we too found we needed to adapt to a new way of delivering software. The new era of the cloud unlocks tremendous potential for innovation to meet our customers’ growing demand for richer and better experiences, while our competition is not slowing down. The need to accelerate innovation and transform how we work is real and urgent.
The road to transformation is not easy and we believe that
https://azure.microsoft.com/blog/10-user-experience-updates-to-the-azure-portal/
We’re constantly working to improve your user experience in the Azure portal. Our goal is to offer you a productive and easy-to-use single pane of glass where you can build, manage, and monitor your Azure services, applications, and infrastructure. In this post, …
https://azure.microsoft.com/blog/azure-sql-data-warehouse-is-now-azure-synapse-analytics/
On November 4, we announced Azure Synapse Analytics, the next evolution of Azure SQL Data Warehouse. Azure Synapse is a limitless analytics service that brings together enterprise data warehousing and Big Data analytics. It gives you the freedom to query …
Over the past few years, we have seen many examples of organizations applying conversational AI in meaningful ways. Accenture and Caesars Entertainment are making their employees more productive with enterprise bots. UPS and Asiana Airlines are using bots to deliver better customer service. And finally, BMW and LaLiga have built their own branded voice assistants, taking control of how customers experience their brand. These are just a few of the organizations that have built conversational AI solutions with Azure AI.
This week at Microsoft Ignite, we announced updates to our products to make it easier for organizations to build robust conversational solutions, and to deploy them wherever their customers are. We are sharing some of the highlights below.
Most popular open source SDK for accelerated bot development
We announced the release of Bot Framework SDK 4.6 making it easier for developers to build enterprise-grade conversational AI experiences. Bot Framework includes a set of open source SDKs and tools for bot development, and can easily integrate with Azure Cognitive Services, enabling developers to build bots that can speak to, listen to, and understand users.
Bot Framework SDK for Microsoft Teams. Developers can build Teams bots with built-in support for Teams messaging
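The SDK is built around a turn-based handler pattern: each incoming user activity is wrapped in a context object and routed to a handler method on the bot. The sketch below mirrors botbuilder-core concepts such as `TurnContext` and `on_message_activity`, but uses self-contained stubs for illustration, not the real SDK classes.

```python
# Illustrative sketch of the Bot Framework turn-handler pattern using stubs.

class TurnContext:
    """Carries one incoming user message and collects the bot's replies (stub)."""
    def __init__(self, text):
        self.activity_text = text
        self.responses = []

    def send_activity(self, text):
        self.responses.append(text)

class EchoBot:
    """Handles a single conversational turn by echoing what the user said."""
    def on_message_activity(self, turn_context):
        turn_context.send_activity(f"You said: {turn_context.activity_text}")

bot = EchoBot()
ctx = TurnContext("hello")
bot.on_message_activity(ctx)
print(ctx.responses[0])  # You said: hello
```

In the real SDK the same shape applies: you subclass an activity handler, override the method for the activity type you care about, and the framework supplies the turn context, which is also where channel-specific capabilities like Teams messaging surface.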
At Microsoft Ignite 2018, we shared our vision to bring together infrastructure, application, and network monitoring into one unified offering and provide full-stack monitoring for your applications. We have since made rapid strides toward delivering that reality to our customers: consolidating our logs, metrics, and alerts platforms; integrating existing capabilities such as Application Insights and Log Analytics; adding new monitoring capabilities for containers and virtual machines; and contributing back to the community through open-source projects such as OpenTelemetry. In this blog, I’ll share the newest enhancements from Azure Monitor at Microsoft Ignite, including four examples of how we continue to build a seamless, integrated monitoring solution that works well for both cloud-native and legacy workloads and is cost-effective. Be sure to read the full blog post to get a list of all the exciting enhancements.
Monitor containers anywhere
Customers love the convenience of the out-of-the-box monitoring that Azure Monitor for containers provides for all their Azure Kubernetes Service (AKS) clusters. But many of you also have Kubernetes clusters running outside AKS. For customers with hybrid environments, we are now launching the ability to monitor Kubernetes clusters on-premises and on Azure Stack (with AKS Engine) in preview. Just
This blog post was co-authored by Jeremy Winter, Partner Director and Tanuj Bansal, Senior Director for Microsoft Azure.
At last year’s Microsoft Ignite 2018, we shared best practices on how to move to the cloud and why Azure is the best destination for all your apps, data, and infrastructure. Since then, we’re happy to share that a number of customers have joined us on Azure—H&R Block, Albertsons, Devon Energy, and Carlsberg Group, just to name a few. Azure has helped these customers drive innovation, enhance their security posture, and reduce costs with unique offers such as Azure Hybrid Benefit.
At this week’s Microsoft Ignite event in Orlando, we shared the approach these customers took and more news in Azure migration sessions and one-on-one architecture review sessions with Azure engineers.
In this blog, we want to share some of the exciting news we shared at Microsoft Ignite.
Accelerating customer success: Azure Migration Program (AMP)
Since its launch in July, AMP has seen an enthusiastic reception with more than a thousand customers entering the program for migration projects ranging across Windows Server, SQL Server, and Linux workloads. To recap, AMP offers customers:
Technical skill building with foundational, workload, migration, and role-specific courses
This post is co-authored by Tina Coll, Senior Product Marketing Manager, Azure Cognitive Services and Anny Dow, Product Marketing Manager, Azure Cognitive Services.
Azure Cognitive Services brings artificial intelligence (AI) within reach of every developer without requiring machine learning expertise. All it takes is an API call to embed the ability to see, hear, speak, understand, and accelerate decision-making into your apps. Enterprises have used these pre-built and custom AI capabilities to deliver more engaging and personalized intelligent experiences. We’re continuing the momentum from Microsoft Build 2019 by making Personalizer generally available, and introducing additional advanced capabilities in the Vision, Speech, and Language categories. With many advancements to share, let’s dive right in.
Personalizer: Powering rich user experiences
Winner of this year’s ‘Most Innovative Product’ award at O’Reilly’s Strata Conference, Personalizer is the only AI service on the market that makes reinforcement learning available at scale through easy-to-use APIs. Powered by reinforcement learning, Personalizer gives developers a way to create rich, personalized experiences for users, even without deep machine learning expertise.
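The reinforcement-learning loop behind a service like Personalizer follows a rank-and-reward pattern: the app asks for the best action to show, then reports back how well it worked. The toy below is a hedged sketch only; the `rank()` and `reward()` names mirror the API's concepts, and the epsilon-greedy logic is a simple stand-in for the service's actual contextual-bandit learner.

```python
import random

# Minimal epsilon-greedy stand-in for the Rank-and-Reward loop. Not the real
# Personalizer service or SDK; actions and reward values are illustrative.

class ToyPersonalizer:
    def __init__(self, actions, epsilon=0.2):
        self.epsilon = epsilon
        self.stats = {a: [0.0, 0] for a in actions}  # [total reward, times shown]

    def rank(self):
        # Mostly exploit the best-performing action so far, sometimes explore.
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))
        return max(self.stats, key=lambda a: self.stats[a][0] / max(self.stats[a][1], 1))

    def reward(self, action, score):
        # The app reports back how well the chosen action worked (e.g. a click = 1.0).
        total, shown = self.stats[action]
        self.stats[action] = [total + score, shown + 1]

random.seed(0)
p = ToyPersonalizer(["article_a", "article_b"])
for _ in range(200):
    choice = p.rank()
    # Simulated users click article_b 80% of the time and article_a never.
    p.reward(choice, 1.0 if choice == "article_b" and random.random() < 0.8 else 0.0)

best = max(p.stats, key=lambda a: p.stats[a][0] / max(p.stats[a][1], 1))
print(best)
```

After a couple hundred simulated turns, the loop has learned to favor the action users actually respond to, which is the core value the managed service provides without you writing the learner yourself.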
Giving customers what they want at any given moment is one of the biggest challenges faced by retail, media, and e-commerce businesses today. Whether it’s
APIs are everywhere. The broad proliferation of applications throughout enterprises often results in large silos of opaque processes and services, making it hard for IT to manage and govern APIs in a systematic way, and for development teams to gain visibility into and make use of APIs that already exist.
Entire industries, such as financial services, are embracing APIs as a means to become more open, for example with open banking initiatives. Open banking is an API-first approach to creating more open, rich ecosystems that encourage third-party participation and usage of the services financial institutions have previously kept behind the scenes.
Products such as Azure API Management were created to address these issues. By letting you manage all APIs in a single, centralized location, you can impose authentication, authorization, throttling, and transformation policies and easily monitor the usage of the APIs associated with your applications, giving you much-needed visibility into your application portfolio(s) at a macro level.
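Policies like these are expressed in API Management as XML fragments applied to the request pipeline. The sketch below uses the real `rate-limit` and `set-header` policy elements, but the specific limits and header name are illustrative values, not recommendations.

```xml
<policies>
    <inbound>
        <base />
        <!-- Throttling: allow at most 100 calls per subscription every 60 seconds -->
        <rate-limit calls="100" renewal-period="60" />
        <!-- Transformation: stamp an illustrative header onto the forwarded request -->
        <set-header name="X-Request-Source" exists-action="override">
            <value>api-gateway</value>
        </set-header>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
</policies>
```

Because the gateway enforces the policy, every API behind it gets consistent throttling and headers without any change to the backend services themselves.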
To succeed in an increasingly connected world, it is key to adopt an API-first approach that lets you:
- Embrace innovation by creating vibrant API ecosystems.
- Secure and manage APIs seamlessly in a hybrid world.
APIs can be a bridge to the