25 Nov

Learn How Power BI and Accessibility Work Together

Source: https://powerbi.microsoft.com/en-us/blog/learn-how-power-bi-and-accessibility-work-together/

Power BI’s accessibility documentation has been expanded from one article to five new articles for you to explore. We hope that these new articles will help you learn more about how Power BI READ MORE

25 Nov

Analytics in Azure Virtual Event

Source: https://powerbi.microsoft.com/en-us/blog/analytics-in-azure-virtual-event/

The next wave of analytics is here, and the Power BI team is excited to be an integral partner with the recently launched Azure Synapse Analytics.

21 Nov

Announcing: Upgrade your classic workspaces to the new workspace experience public preview is rolling out

Source: https://powerbi.microsoft.com/en-us/blog/announcing-upgrade-your-classic-workspaces-to-the-new-workspace-experience-public-preview-is-rolling-out/

We’ve started to roll out the capability to upgrade classic workspaces to the new workspace experience as a public preview. This post gives you resources to learn about the upgrade before it becomes available to READ MORE

21 Nov

Azure Backup support for SQL Server 2019 and restore as files

Source: https://azure.microsoft.com/blog/azure-backup-support-for-sql-server-2019-and-restore-as-files/

As SQL Server 2019 continues to push the boundaries of availability, performance, and data intelligence, a centrally managed, enterprise-scale backup solution is imperative to ensure the protection of all that data. This is especially true if you are running the READ MORE

21 Nov

Change feed support now available in preview for Azure Blob Storage

Change feed support for Microsoft Azure Blob storage is now available in preview. Change feed provides a guaranteed, ordered, durable, read-only log of all the creation, modification, and deletion change events that occur to the blobs in your storage account. This log is stored as append blobs within your storage account, so you can manage data retention and access control based on your requirements.

Change feed is the ideal solution for bulk handling of large volumes of blob changes in your storage account, as opposed to periodically listing and manually comparing for changes. It enables cost-efficient recording and processing by providing programmatic access such that event-driven applications can simply consume the change feed log and process change events from the last checkpoint.

Some scenarios that would benefit from consuming a blob change feed include:

- Bulk processing a group of newly uploaded files for virus scanning, resizing, or backups.
- Storing, auditing, and analyzing changes to your objects over any period of time for data management or compliance.
- Combining data uploaded by various IoT sensors into a single collection for data transformation and insights.
- Additional data movement by synchronizing with a cache, search engine, or data warehouse.

How to get started
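The checkpoint-and-resume pattern described above can be sketched in plain Python. This is an illustrative model only: `ChangeEvent` and `CheckpointedConsumer` are invented names, not part of the Azure SDK, and a real application would read the ordered log through the Azure Blob change feed client library rather than from an in-memory list.

```python
# Minimal sketch of consuming an ordered change log from a checkpoint.
# A prior run records the offset of the last event it handled; the next
# run skips everything at or before that offset and resumes from there.
from dataclasses import dataclass, field

@dataclass
class ChangeEvent:
    offset: int          # position in the ordered, append-only log
    event_type: str      # e.g. "BlobCreated", "BlobDeleted"
    blob_name: str

@dataclass
class CheckpointedConsumer:
    checkpoint: int = -1                       # offset of the last event handled
    handled: list = field(default_factory=list)

    def consume(self, log):
        """Process only events that occur after the stored checkpoint."""
        for event in log:
            if event.offset <= self.checkpoint:
                continue                       # already processed in a prior run
            self.handled.append(event.blob_name)
            self.checkpoint = event.offset

log = [ChangeEvent(i, "BlobCreated", f"blob-{i}") for i in range(5)]
consumer = CheckpointedConsumer()
consumer.consume(log[:3])   # first run sees offsets 0..2
consumer.consume(log)       # later run resumes at offset 3; 0..2 are skipped
```

Because the change feed log is ordered and durable, persisting only the checkpoint offset is enough for an event-driven application to recover after a restart without reprocessing earlier events.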

20 Nov

How AI can supercharge content understanding for businesses

Organizations face challenges when it comes to extracting insights, finding meaning, and uncovering new opportunities in the vast troves of content at their disposal. In fact, 82 percent of organizations surveyed in the latest Harvard Business Review (HBR) Analytic Services report say that exploring and understanding their content in a timely manner is a significant challenge. This is exacerbated because content is not only spread across multiple systems but also stored in multiple formats such as PDF, JPEG, spreadsheets, and audio files.

The first wave of artificial intelligence (AI) was designed for narrow applications, training a single model to address a specific task such as handwriting recognition. What’s been challenging, however, is that these models individually can’t capture all the different attributes hidden in various types of content. This means developers must painfully stitch together disparate components to fully understand their content.

Instead, organizations need a solution that spans vision, speech, and language to fully unlock insights from all content types. We are heavily investing in this new category of AI, called knowledge mining, to enable organizations to maximize the value of their content.

Knowledge mining with Azure Cognitive Search

Organizations can take advantage of knowledge mining today with Azure Cognitive

19 Nov

How to deploy SQL Server 2019 Big Data Clusters

SQL Server 2019 Big Data Clusters is a scale-out, data virtualization platform built on top of the Kubernetes container platform. This ensures a predictable, fast, and elastically scalable deployment, regardless of where it’s deployed. In this blog post, we’ll explain how to deploy SQL Server 2019 Big Data Clusters to Kubernetes.

First, the tools

Deploying Big Data Clusters to Kubernetes requires a specific set of client tools. Before you get started, please install the following:

- azdata: Deploys and manages Big Data Clusters.
- kubectl: Creates and manages the underlying Kubernetes cluster.
- Azure Data Studio: Graphical interface for using Big Data Clusters.
- SQL Server 2019 extension: Azure Data Studio extension that enables the Big Data Clusters features.

Choose your Kubernetes

Big Data Clusters is deployed as a series of interrelated containers that are managed in Kubernetes. You have several options for hosting Kubernetes, depending on your use case, including:

- Azure Kubernetes Service (AKS): You can use the Azure portal to deploy a managed Kubernetes cluster in Azure; all you manage and maintain are the agent nodes, and you don’t even have to provision your own hardware.
- Multiple Linux machines: Kubernetes can also be deployed to multiple Linux machines, physical or virtual. This is a great option
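With the tools installed and a Kubernetes cluster available in your current kubectl context, the deployment itself is a short azdata sequence. This is a hedged sketch, not the post’s exact procedure: the profile name `aks-dev-test`, the target directory `my-bdc-config`, and the `mssql-cluster` namespace are assumptions, and exact flags can vary by azdata version.

```shell
# Generate a deployment configuration from a built-in profile
# (aks-dev-test is an assumed profile name for an AKS target).
azdata bdc config init --source aks-dev-test --target my-bdc-config

# Review my-bdc-config/bdc.json and control.json, then set the
# admin credentials the deployment will use.
export AZDATA_USERNAME=admin
export AZDATA_PASSWORD='<strong-password>'

# Deploy the Big Data Cluster into the current kubectl context.
azdata bdc create --config my-bdc-config --accept-eula yes

# Watch the pods come up and check overall cluster health.
kubectl get pods -n mssql-cluster --watch
azdata bdc status show
```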


19 Nov

Azure high-performance computing at SC’19

HBv2 Virtual Machines for HPC, Azure’s most powerful yet, now in preview

Azure HB v2-series Virtual Machines (VM) for high-performance computing (HPC) are now in preview in the South Central US region.

HBv2-series Virtual Machines are Azure’s most advanced HPC offering yet, featuring performance and Message Passing Interface scalability rivaling the most advanced supercomputers on the planet, and price and performance on par with on-premises HPC deployments.

HBv2 Virtual Machines are designed for a variety of real-world HPC applications, from fluid dynamics to finite element analysis, molecular dynamics, seismic processing & imaging, weather modeling, rendering, computational chemistry, and more.

Each HBv2 Virtual Machine features 120 AMD EPYC™ 7742 processor cores at 2.45 GHz (3.3 GHz boost), 480 GB of RAM, 480 MB of L3 cache, and no simultaneous multithreading. An HBv2 Virtual Machine also provides up to 340 GB per second of memory bandwidth, up to four teraflops of double-precision compute, and up to eight teraflops of single-precision compute.

Finally, an HBv2 Virtual Machine features 900 GB of low-latency, high-bandwidth block storage via NVMeDirect, and supports up to eight Azure Managed Disks.

200 gigabit high data rate (HDR) InfiniBand comes to Azure

HBv2-series Virtual Machines feature one of the

19

Nov

Bringing confidential computing to Kubernetes

Source: https://azure.microsoft.com/blog/bringing-confidential-computing-to-kubernetes/

Historically, data has been protected at rest through encryption in data stores, and in transit using network technologies. However, as soon as that data is processed in the CPU of a computer, it is decrypted and in plain text. New READ MORE
