Climate experts across the globe agree: if we can’t drastically reduce carbon emissions, our planet will face catastrophic consequences. Microsoft has operated carbon neutral since 2012, and in January 2020 Brad Smith announced our commitment to going carbon negative by 2030. This isn’t a goal we can reach in one easy swoop—it will take time, dedication, and many small steps that coalesce into something greater.
As the cloud business grows, our datacenter footprint grows. In our journey toward carbon negative, Microsoft is taking steps to roll back the effect datacenters have on the environment. Reaching this goal will take many steps, along with the implementation of innovative technologies that have yet to be developed.
Many companies are reaching for net zero emissions, but we’re taking it even further. We’re not just reducing our output to zero. We’re committed to reducing our emissions by half, and then removing the carbon we’ve emitted since 1975, to truly go carbon negative.
The journey to carbon negative
A big part of going carbon negative means completely changing the way datacenters operate. Datacenters have adopted some sustainable methods around cooling, including open-air and adiabatic cooling. These methods have helped to drastically reduce the water and
As the world adjusts to new ways of working and staying connected, we remain committed to providing Azure AI solutions to help organizations invent with purpose.
Building on our vision to empower all developers to use AI to achieve more, today we’re excited to announce expanded capabilities within Azure Cognitive Services, including:
- Text Analytics for health preview.
- Form Recognizer general availability.
- Custom Commands general availability.
- New Neural Text to Speech voices.
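As a rough illustration of how these language services are called, the request body below follows the standard Text Analytics documents envelope; the exact endpoint path and any health-specific options are not shown here, and the helper name is our own, not part of the SDK:

```python
# Sketch: building a Text Analytics-style request body for clinical text.
# The envelope shape ({"documents": [...]}) follows the common Text Analytics
# request format; field values here are illustrative.
import json

def build_health_request(texts, language="en"):
    """Wrap raw strings in the documents envelope Text Analytics expects."""
    return {
        "documents": [
            {"id": str(i + 1), "language": language, "text": t}
            for i, t in enumerate(texts)
        ]
    }

payload = build_health_request(["Patient reports a 100.4 F fever and dry cough."])
body = json.dumps(payload)  # ready to POST to the service endpoint
```

In practice you would send this body to your Cognitive Services resource with your subscription key in the request headers.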
Companies in healthcare, insurance, sustainable farming, and other fields continue to choose Azure AI to build and deploy AI applications to transform their businesses. According to IDC, by 2022, 75 percent of enterprises will deploy AI-based solutions to improve operational efficiencies and deliver enhanced customer experiences.
To meet this growing demand, today’s product updates expand on existing language, vision, and speech capabilities in Azure Cognitive Services to help developers build mission-critical AI apps that enable richer insights, save time and reduce costs, and improve customer engagement.
Get rich insights with powerful natural language processing
One of the ways organizations are adapting is by scaling their ability to rapidly process data and generate new insights from it. COVID-19 has accelerated the urgency, particularly for the healthcare industry. With the overwhelming amount
“In the era of big data, insights collected from cloud services running at the scale of Azure quickly exceed the attention span of humans. It’s critical to identify the right steps to maintain the highest possible quality of service based on the large volume of data collected. In applying this to Azure, we envision infusing AI into our cloud platform and DevOps process, becoming AIOps, to enable the Azure platform to become more self-adaptive, resilient, and efficient. AIOps will also support our engineers to take the right actions more effectively and in a timely manner to continue improving service quality and delighting our customers and partners. This post continues our Advancing Reliability series highlighting initiatives underway to keep improving the reliability of the Azure platform. The post that follows was written by Jian Zhang, our Program Manager overseeing these efforts, as she shares our vision for AIOps, and highlights areas of this AI infusion that are already a reality as part of our end-to-end cloud service management.”—Mark Russinovich, CTO, Azure
This post includes contributions from Principal Data Scientist Manager Yingnong Dang and Partner Group Software Engineering Manager Murali Chintalapati.
As Mark mentioned when he launched this Advancing Reliability blog
Machine learning (ML) is gaining momentum across a number of industries and scenarios as enterprises look to drive innovation, increase efficiency, and reduce costs. Microsoft Azure Machine Learning empowers developers and data scientists with enterprise-grade capabilities to accelerate the ML lifecycle. At Microsoft Build 2020, we announced several advances to Azure Machine Learning across the following areas: ML for all skills, Enterprise grade MLOps, and responsible ML.
ML for all skills
New enhancements make machine learning accessible to users of all skill levels.
Enhanced notebook in preview
Data scientists and developers can now access an enhanced notebook editor directly inside Azure Machine Learning studio. New capabilities to create, edit, and collaborate make remote work and sharing easier for data science teams, and the notebook is fully compatible with Jupyter.
- Boost development productivity with features like IntelliSense, inline error highlighting, and code suggestions from VS Code, which deliver a best-in-class coding experience in Jupyter notebooks.
- Access real-time co-editing (coming soon) for seamless remote collaboration or pair debugging.
- Use inline controls to start, stop, and create a new compute using a GPU or CPU Compute Instance inside notebooks.
- Add new kernels to the notebook editor and quickly switch between different kernels, like Python and R.
Real-time notebook co-editing
It’s inspiring to see how customers continue to reimagine how they work with the help of AI, which is more important today than ever. Our customers are finding innovative ways to deliver crisis management solutions, drive cost-savings, redefine customer engagement, and accelerate decision-making.
Here are some notable examples we’ve recently seen:
Scaling crisis management
On the frontlines, first responders rely on Azure AI to scale their triage process to address the overwhelming number of people needing care and to ease volume in the system. For example, healthcare providers have created more than 1,400 bots using our Healthcare Bot service, helping more than 27 million people access critical healthcare information. The U.S. Centers for Disease Control and Prevention released a COVID-19 assessment bot that is powered by Azure Bot Service. Motorola Solutions uses Azure Bot Service, as well as speech and language services, in its own voice assistant for public safety, ViQi, to help 911 dispatchers and first responders focus on what matters most.
Azure AI is also helping customers optimize their operations to reduce costs. KPMG built a risk and fraud analytics solution using our speech and language services to streamline call center transcription and translation—cutting time,
Digital transformation in manufacturing has the potential to increase annual global economic value by $4.5 trillion according to the IDC MarketScape. With so much upside, manufacturers are looking at how technologies like IoT, machine learning, and artificial intelligence (AI) can be used to optimize supply chains, improve factory performance, accelerate product innovation, and enhance service offerings.
Digital transformation starts by collecting data from machines on the plant floor, assets in the supply chain, or products being used by customers. This data can be combined with other business data and then modeled and analyzed to gain actionable insights.
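A minimal sketch of that combine-and-analyze pattern, with hypothetical asset IDs, field names, and temperature threshold: machine telemetry is joined with business records on an asset ID and then scanned for actionable conditions:

```python
# Sketch: join plant-floor telemetry with business data, then flag anomalies.
# Asset IDs, field names, and the 80 C limit are illustrative assumptions.
telemetry = [
    {"asset_id": "press-01", "temp_c": 85.2},
    {"asset_id": "press-02", "temp_c": 61.0},
]
business = {
    "press-01": {"line": "A", "maintenance_due": "2020-07-01"},
    "press-02": {"line": "B", "maintenance_due": "2020-09-15"},
}

def flag_overheating(readings, records, limit_c=80.0):
    """Merge each reading with its business record; keep only hot assets."""
    flagged = []
    for r in readings:
        if r["temp_c"] > limit_c:
            merged = {**r, **records.get(r["asset_id"], {})}
            flagged.append(merged)
    return flagged

alerts = flag_overheating(telemetry, business)
```

The resulting alerts carry both the sensor reading and the business context (production line, maintenance schedule), which is what turns raw data into an actionable insight.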
Let’s take a look at three manufacturers—Festo, Kao, and AkzoNobel—and see how each one is using technologies like IoT, machine learning, and AI to accelerate their digital transformation.
Providing predictive maintenance as a service
Based in Germany, Festo sells electric and pneumatic drive solutions to 300,000 customers in 176 countries. The company’s goal is to increase uptime for customers by providing predictive maintenance as software-as-a-service (SaaS) offerings. Festo’s strategy is to connect machines to the cloud with Azure IoT and then enable customers to visualize data along the entire value chain.
One of the first SaaS offerings is Festo
Incremental enrichment is a new feature of Azure Cognitive Search that brings a declarative approach to indexing your data. When incremental enrichment is turned on, document enrichment is performed at the least cost, even as your skills continue to evolve. Indexers in Azure Cognitive Search add documents to your search index from a data source. Indexers track updates to the documents in your data sources and update the index with the new or updated documents from the data source.
Incremental enrichment extends that change tracking from document changes in the data source to all aspects of the enrichment pipeline. With incremental enrichment, the indexer drives your documents to eventual consistency with your data source, the current version of your skillset, and the indexer.
Indexers have a few key characteristics:
- Data source specific.
- State aware.
- Can be configured to drive eventual consistency between your data source and index.
In the past, editing your skillset by adding, deleting, or updating skills left you with a sub-optimal choice. Either rerun all the skills on the entire corpus, essentially a reset on your indexer, or tolerate version drift where documents in your index are enriched with different versions of
Today, Yusuf Mehdi, Corporate Vice President of Modern Life and Devices, announced the availability of new Microsoft 365 Personal and Family subscriptions. In his blog, he shared a few examples of how Microsoft 365 is innovating to deliver experiences powered by Azure AI.
We’re expanding the Microsoft Azure Stack Edge with NVIDIA T4 Tensor Core GPU preview during the GPU Technology Conference (GTC Digital). Azure Stack Edge is a cloud-managed appliance that brings Azure’s compute, storage, and machine learning capabilities to the edge for fast local analysis and insights. With the included NVIDIA GPU, you can bring hardware acceleration to a diverse set of machine learning (ML) workloads.
What’s new with Azure Stack Edge
At Mobile World Congress in November 2019, we announced a preview of the NVIDIA GPU version of Azure Stack Edge and we’ve seen incredible interest in the months that followed. Customers in industries including retail, manufacturing, and public safety are using Azure Stack Edge to bring Azure capabilities into the physical world and unlock scenarios such as the real-time processing of video powered by Azure Machine Learning.
These past few months, we’ve taken our customers’ feedback to make key improvements and are excited to make our preview available to even more customers today.
Azure Machine Learning: Build and train your model in the cloud, then deploy it to the edge for FPGA or
The world of supercomputing is evolving. Work once limited to high-performance computing (HPC) on-premises clusters and traditional HPC scenarios is now being performed at the edge, on-premises, in the cloud, and everywhere in between. Whether it’s a manufacturer running advanced simulations, an energy company optimizing drilling through real-time well monitoring, an architecture firm providing professional virtual graphics workstations to employees who need to work remotely, or a financial services company using AI to navigate market risk, Microsoft’s collaboration with NVIDIA makes access to NVIDIA graphics processing unit (GPU) platforms easier than ever.
These modern needs require advanced solutions that were traditionally limited to a few organizations because they were hard to scale and took a long time to deliver. Today, Microsoft Azure delivers HPC capabilities, a comprehensive AI platform, and the Azure Stack family of hybrid and edge offerings that directly address these challenges.
This year during GTC Digital, we’re spotlighting some of the most transformational applications powered by NVIDIA GPU acceleration that highlight our commitment to edge, on-premises, and cloud computing. Registration is free, so sign up to learn how Microsoft is powering transformation.
Visualization and GPU workstations
Azure enables a wide range of visualization workloads, which are critical