Azure offers a wide variety of virtual machine (VM) sizes tailored to meet diverse customer needs. Our NV size family has been optimized for GPU-powered visualization workloads, such as CAD, gaming, and simulation. Today, our customers are using these VMs to power remote visualization services and virtual desktops in the cloud. While our existing NV size VMs work great to run graphics heavy visualization workloads, a common piece of feedback we receive from our customers is that for entry-level desktops in the cloud, only a fraction of the GPU resources is needed. Currently, the smallest sized GPU VM comes with one full GPU and more vCPU/RAM than a knowledge worker desktop requires in the cloud. For some customers, this is not a cost-effective configuration for entry-level scenarios.
Announcing NVv4 Azure Virtual Machines based on AMD EPYC 7002 processors and virtualized AMD Radeon Instinct MI25 GPUs.
The new NVv4 virtual machine series will be available for preview this fall. NVv4 offers unprecedented flexibility in GPU resourcing, giving customers more choice than ever before: they can select VMs with anywhere from a whole GPU down to 1/8th of a GPU. This makes entry-level and low-intensity GPU workloads more cost-effective than ever before.
Source: https://powerbi.microsoft.com/en-us/blog/connecting-azure-analysis-services-to-azure-data-lake-storage-gen2/ With the public preview of “Multi-Protocol Access” on Azure Data Lake Storage Gen2 now available, Azure Analysis Services (AAS) can use the Blob API to access files in ADLS Gen2. Check how to do it!
Microsoft is committed to giving our customers industry-leading performance for all their workloads. After being the first global cloud provider to announce the deployment of AMD EPYC™ based Azure Virtual Machines in 2017, we’ve been working together to continue bringing the latest innovation to enterprises.
Today, we are announcing our second-generation HB-series Azure Virtual Machines, HBv2, featuring the latest AMD EPYC 7002 processors. Customers will be able to increase HPC performance and scalability to run materially larger workloads on Azure. We’ll also be bringing AMD EPYC 7002 processors and Radeon Instinct GPUs to our family of cloud-based virtual desktops. Finally, our new Dav4 and Eav4-series Azure Virtual Machines, in preview today, provide more customer choice to meet a broad range of requirements for general-purpose workloads using the new AMD EPYC™ 7452 processor.
Our growing Azure HPC offerings
Customers are choosing our Azure HPC offerings (HB-series), built on first-generation AMD EPYC (“Naples”) processors, for their performance and scalability. We’ve seen a 33 percent memory bandwidth advantage with EPYC, and that’s a key factor for many of our customers’ HPC workloads; fluid dynamics, for example, is one workload in which this advantage is valuable. Azure has an increasing number of customers
We are making it easier for customers to “lift and shift” applications to the cloud while maintaining the same security model used on-premises, with the general availability of Azure Active Directory Domain Services (Azure AD DS) authentication for Azure Files. By integrating Azure AD DS, you can mount your Azure file share over SMB using Azure Active Directory (Azure AD) credentials from Azure AD DS domain-joined Windows virtual machines (VMs), with NTFS access control lists (ACLs) enforced.
Azure AD DS authentication for Azure Files allows users to specify granular permissions on shares, files, and folders. It unblocks common use cases like single-writer, multi-reader scenarios for your line-of-business applications. Because the file permission assignment and enforcement experience matches that of NTFS, lifting and shifting your application into Azure is as easy as moving it to a new SMB file server. This also makes Azure Files an ideal shared storage solution for cloud-based services. For example, Windows Virtual Desktop recommends using Azure Files to host user profiles and leveraging Azure AD DS authentication for access control.
Since Azure Files strictly enforces NTFS discretionary access control lists (DACLs), you can use familiar tools like Robocopy to
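In practice, a mount-and-copy flow from a domain-joined Windows VM might look like the following sketch; the storage account, share, and path names here are placeholders, not values from this post:

```powershell
# Hypothetical sketch: mount an Azure file share over SMB. With Azure AD DS
# authentication, the domain-joined VM's signed-in identity is used, so no
# storage account key needs to be passed on the command line.
net use Z: \\mystorageacct.file.core.windows.net\myshare

# Robocopy with /COPY:DATS carries Data, Attributes, Timestamps, and
# Security (the NTFS DACLs) over to the share; /MIR mirrors the tree.
robocopy C:\Profiles Z:\Profiles /MIR /COPY:DATS
```

Because the DACLs travel with the files, permissions enforced on-premises continue to apply after the copy.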
Choosing Azure for your applications and services lets you take advantage of a wide array of security tools and capabilities that help you build secure solutions on Azure. Among these capabilities is Azure Disk Encryption, designed to help protect and safeguard your data to meet your organizational security and compliance commitments. It uses the industry-standard BitLocker Drive Encryption for Windows and DM-Crypt for Linux to provide volume encryption for OS and data disks. The solution is integrated with Azure Key Vault to help you control and manage disk encryption keys and secrets, and it ensures that all data on virtual machine (VM) disks is encrypted both in transit and at rest in Azure Storage.
Beyond securing your applications, it is important to have a disaster recovery plan in place to keep your mission-critical applications up and running through planned and unplanned outages. Azure Site Recovery helps orchestrate replication, failover, and recovery of applications running on Azure Virtual Machines so that they remain available from a secondary region during an outage in the primary region.
Azure Site Recovery now supports disaster recovery of Azure disk encryption (V2) enabled virtual machines without
High availability is crucial to mission-critical production environments. The Red Hat Enterprise Linux High Availability Add-On provides reliability and availability to critical production services that use it. Today, we’re sharing performance improvements and image updates around the High Availability Add-On for Red Hat Enterprise Linux (RHEL) on Azure.
Pacemaker is a robust and powerful open-source resource manager used in highly available compute clusters. It is a key part of the High Availability Add-On for RHEL.
Pacemaker has been updated with performance improvements in the Azure Fencing Agent to significantly decrease Azure failover time, which greatly reduces customer downtime. This update is available to all RHEL 7.4+ users using either the Pay-As-You-Go images or Bring-Your-Own-Subscription images from the Azure Marketplace.
New pay-as-you-go RHEL images with the High Availability Add-On
We now have RHEL Pay-As-You-Go (PAYG) images with the High Availability Add-On available in the Azure Marketplace. These RHEL images have additional access to the High Availability Add-On repositories. Pricing details for these images are available in the pricing calculator.
The following RHEL HA PAYG images are now available in the Marketplace for all Azure regions, including US Government Cloud:
If you’re experiencing problems with your applications, a great place to start investigating solutions is through your Azure Service Health dashboard. In this blog post, we’ll explore the differences between the Azure status page and Azure Service Health. We’ll also show you how to get started with Service Health alerts so you can stay better informed about service issues and take action to improve your workloads’ availability.
How and when to use the Azure status page
The Azure status page works best for tracking major outages, especially if you’re unable to log into the Azure portal or access Azure Service Health. Many Azure users visit the status page regularly. It predates Azure Service Health and has a friendly format that shows the status of all Azure services and regions at a glance.
The Azure status page, however, doesn’t show all information about the health of your Azure services and regions. The status page isn’t personalized, so you need to know exactly which services and regions you’re using and locate them in the grid. The status page also doesn’t include information about non-outage events that could affect your availability. For example, planned maintenance events and health advisories (think service retirements
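One way to get started with Service Health alerts is through the Azure CLI. The following is a hypothetical sketch (the resource group, alert name, and action group ID are placeholders):

```shell
# Hypothetical sketch: create an activity log alert that fires on
# Service Health events and notifies an existing action group.
az monitor activity-log alert create \
    --name ServiceHealthAlert \
    --resource-group MyGroup \
    --condition category=ServiceHealth \
    --action-group "/subscriptions/<sub-id>/resourceGroups/MyGroup/providers/microsoft.insights/actionGroups/MyActionGroup"
```

Unlike the status page, an alert like this is scoped to your subscription, so you are only notified about issues that can actually affect your resources.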
Azure Databricks is a fast, easy, and collaborative Apache Spark-based analytics platform that simplifies the process of building big data and artificial intelligence (AI) solutions. Azure Databricks provides data engineers and data scientists an interactive workspace where they can use the languages and frameworks of their choice. Natively integrated with services like Azure Machine Learning and Azure SQL Data Warehouse, Azure Databricks enables customers to build end-to-end modern data warehouse, real-time analytics, and machine learning solutions.
Save up to 37 percent on your Azure Databricks workloads
Azure Databricks Unit pre-purchase plan is now generally available—expanding our commitment to make Azure the most cost-effective cloud for running your analytics workloads.
Today, with the Azure Databricks Unit (DBU) pre-purchase plan, you can start unlocking the benefits of Azure Databricks at significantly reduced costs when you pre-pay for Databricks compute for a one- or three-year term. With this new pricing option, you can achieve savings of up to 37 percent compared to pay-as-you-go pricing. You can learn more about the discount tiers on our pricing page. All Azure Databricks SKUs—Premium and Standard SKUs for Data Engineering Light, Data Engineering, and Data Analytics—are eligible for DBU pre-purchase.
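To make the percentage concrete, here is a back-of-the-envelope sketch of the savings math; the DBU rate and annual consumption below are invented for illustration and are not Azure's actual prices:

```shell
# Hypothetical illustration of pre-purchase savings; all figures are
# made-up placeholders, not real Azure Databricks pricing.
PAYG_RATE_CENTS=40     # assumed pay-as-you-go price per DBU: $0.40
DBUS_PER_YEAR=100000   # assumed annual DBU consumption
DISCOUNT_PCT=37        # maximum advertised discount

PAYG_COST=$(( DBUS_PER_YEAR * PAYG_RATE_CENTS / 100 ))
PREPAY_COST=$(( PAYG_COST * (100 - DISCOUNT_PCT) / 100 ))
echo "pay-as-you-go: \$${PAYG_COST}/year, pre-purchase: \$${PREPAY_COST}/year"
```

Under these assumed numbers, pre-paying turns a \$40,000 annual pay-as-you-go bill into \$25,200; the real discount depends on the tier and term you select on the pricing page.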
Compared with other Azure services with
Today, we are announcing the most comprehensive and compelling migration offer available in the industry to help customers simplify their cloud analytics journey.
This collaboration between Microsoft and Informatica provides customers an accelerated path for their digital transformation. As customers modernize their analytics systems, it enables them to truly begin integrating emerging technologies, such as AI and machine learning, into their business. Without migrating analytics workloads to the cloud, it becomes difficult for customers to maximize the potential their data holds.
For customers that have spent years tuning analytics appliances such as Teradata and Netezza, starting the journey toward the cloud can seem overwhelming. These customers have invested valuable time, skills, and personnel to achieve optimal performance from analytics systems that contain their business's most sensitive and valuable data. We understand that the idea of migrating these systems to the cloud can seem risky and daunting. This is why we are partnering with Informatica to help customers begin their cloud analytics journey today with an industry-leading offer.
With this offering, customers can now work with Azure and Informatica to easily understand their current data estate, determine what data is connected to their current
Source: https://powerbi.microsoft.com/en-us/blog/help-improve-power-bi-copy-vs-export-survey/ Take our survey to help us continue to improve copy vs. export related scenarios in Power BI.