This post was authored by the Microsoft Cognitive Services Team.
We understand that the commitments we make about data are essential for any organization. To give customers more control, we’ve updated our Cognitive Services terms for customer data. Here’s what this means for our customers.
On February 1, we began moving Cognitive Services under the same terms as other Azure services. Under the new terms, Cognitive Services customers own their customer data and can manage and delete it. With this change, many Cognitive Services are now aligned with the terms that apply to other Azure services.
Terms for Computer Vision, Face, Content Moderator, Text Analytics, and Speech services have already changed, with updates coming to Language Understanding on March 1 and Microsoft Translator on May 1. As new products are added to Cognitive Services, they will align with the same standards as other Azure services, with the exception of Bing Search Services.
Bing Search Services data will continue to be treated differently than other customer data. For example, we use search queries that you provide to Bing Search Services to improve our search algorithms over time.
We are making these updates because we strive to be transparent in our privacy practices.
This post is authored by Alan Yu, Program Manager, SQL Server.
We are excited to announce the February release of SQL Operations Studio is now available.
SQL Operations Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux. To learn more, visit our GitHub.
SQL Operations Studio was announced for Public Preview on November 15th at Connect(), and this February release is the third major update since the announcement. If you missed it, the January release announcement is available here.
The February release includes several major repo updates and feature releases, including:
- Added Auto-Update Installation feature
- Added Connection Dialog 'Database' Drop-down
- Added functionality for new query tabs keeping active connection
- Fixed bugs in SQL Editor and auto-completion
For complete updates, refer to the Release Notes.
We want to use the February Insiders builds to test the auto-update feature. The 0.26.2 build will be released as an auto-update to 0.26.1 (assuming there are no issues that require publishing a new build to successfully support auto-update).
To discover updates faster,
This post is authored by Xiaochen Wu, Program Manager, SQL Server.
Azure SQL Data Sync allows users to synchronize data between Azure SQL Databases and SQL Server databases, either uni-directionally or bi-directionally. The feature was first introduced in 2012. At that time, few large databases were hosted in Azure, so we applied some size limits when we built the data sync service, including up to 30 databases (at most five on-premises SQL Server databases) in a single sync group, and up to 500 tables in any database in a sync group.
Today, there are more than two million Azure SQL Databases and the maximum database size is 4 TB, but those data sync limits remain. This is mainly because syncing is a size-of-data operation: without an architectural change, we can't ensure the service sustains the heavy load of syncing at large scale. We are working on improvements in this area, and some of these limits will be raised or removed in the future. In this article, we are going to show you how to use data sync to sync data between a large number of databases and tables, including some best practices and how to
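One practical way to work within the 500-tables-per-database limit is to partition a large table list across several sync groups. The helper below is a hypothetical illustration of that planning step, not part of the Data Sync service or its API:

```python
# Illustrative only: partition a large table list into batches, one batch
# per sync group, so no group exceeds Data Sync's 500-table limit.
MAX_TABLES_PER_SYNC_GROUP = 500

def plan_sync_groups(tables, limit=MAX_TABLES_PER_SYNC_GROUP):
    """Split `tables` into consecutive chunks of at most `limit` tables."""
    return [tables[i:i + limit] for i in range(0, len(tables), limit)]

tables = [f"dbo.table_{n}" for n in range(1200)]
groups = plan_sync_groups(tables)
print(len(groups))     # 3 sync groups
print(len(groups[0]))  # 500 tables in the first group
```

Each resulting chunk would then be configured as its own sync group through the portal or PowerShell.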
Source: https://powerbi.microsoft.com/en-us/blog/announcing-visual-studio-team-services-vsts-solution-template/ Announcing Visual Studio Team Services (VSTS) Solution Template Blue Margin, Inc., Microsoft Gold Partner in Data Analytics and Data Platform just released a VSTS solution template to provide project…
Source: https://powerbi.microsoft.com/en-us/blog/microsoft-at-gartner-data-and-analytics-summit-2018-in-grapevine-tx/ In early March, more than 3,000 data and analytics leaders will gather in Grapevine, Texas for the 2018 Gartner Data and Analytics Summit. Microsoft is excited to sponsor this event and bring some
We are excited to share the general availability of ExpressRoute monitoring with Network Performance Monitor (NPM). A few months ago, we announced ExpressRoute Monitor with NPM in public preview. Since then, we've seen many users monitor their Azure ExpressRoute private peering connections and, working with customers, we've gathered a lot of great feedback. While we're not done working to make ExpressRoute monitoring best in class, we're ready and eager for everyone to get their hands on it. In this post, I'll take you through some of the capabilities that ExpressRoute Monitor provides. To get started, watch a brief demo video explaining the ExpressRoute monitoring capability in Network Performance Monitor.
Monitor connectivity to Azure VNETs over ExpressRoute
NPM can monitor the packet loss and network latency between your on-premises resources (branch offices, datacenters, and office sites) and Azure VNETs connected through an ExpressRoute. You can set up alerts to be proactively notified whenever loss or latency crosses a threshold. In addition to viewing the near real-time values and historical trends of the performance data, you can use the network state recorder to go back in time and view a particular network state, in order to investigate difficult-to-catch transient issues.
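Conceptually, an alert rule of this kind is just a threshold check over the loss and latency samples. The sketch below illustrates that check; the sample values, thresholds, and field names are made up for the example and are not NPM's actual data format:

```python
# Illustrative only: the kind of threshold evaluation an alert rule
# performs over loss/latency samples. Values below are hypothetical.
LATENCY_THRESHOLD_MS = 30.0
LOSS_THRESHOLD_PCT = 1.0

def breaches(samples, latency_threshold=LATENCY_THRESHOLD_MS,
             loss_threshold=LOSS_THRESHOLD_PCT):
    """Return the samples whose latency or packet loss crosses a threshold."""
    return [s for s in samples
            if s["latency_ms"] > latency_threshold
            or s["loss_pct"] > loss_threshold]

samples = [
    {"time": "10:00", "latency_ms": 12.4, "loss_pct": 0.0},
    {"time": "10:05", "latency_ms": 45.1, "loss_pct": 0.2},  # latency breach
    {"time": "10:10", "latency_ms": 18.9, "loss_pct": 2.5},  # loss breach
]
print([s["time"] for s in breaches(samples)])  # ['10:05', '10:10']
```

In NPM itself you would configure the thresholds on the monitoring rule rather than write this logic yourself.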
We are happy to announce that Azure Site Recovery (ASR) now provides you the ability to set up Disaster Recovery (DR) for IaaS VMs that use managed disks. With this feature, ASR fulfills an important requirement to become an all-encompassing DR solution for all of your production applications hosted on IaaS VMs in Azure, including applications hosted on VMs with managed disks.
Managed disks provide several advantages including simplification of storage management and guaranteeing industry-leading durability and availability for disk storage.
When you protect virtual machines on managed disks, Azure Site Recovery creates a replica managed disk in the target region corresponding to each managed disk of your production VM in the primary region. This replica disk acts as the data store for the source disk in the primary region, thus eliminating the need to create and manage multiple storage accounts in the target region to store data for your protected machines.
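The one-to-one correspondence between source disks and replica disks can be sketched as a simple mapping. The function and naming scheme below are purely illustrative; ASR generates its own replica disk names:

```python
# Illustrative only: model the one-replica-per-source-disk correspondence
# that ASR maintains. Disk, VM, and region names here are hypothetical.
def plan_replica_disks(vm_name, source_disks, target_region):
    """Map each source managed disk to a replica disk in the target region."""
    return {disk: f"{vm_name}-{disk}-asrreplica-{target_region}"
            for disk in source_disks}

disks = [f"datadisk{i}" for i in range(5)]
mapping = plan_replica_disks("prodvm01", disks, "westus2")
print(len(mapping))  # 5 replica disks, one per source disk
```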
Let us look at an example of protecting a virtual machine with five managed disks. As shown below in Fig 1 and Fig 2,
You can enable protection for the virtual machine via the virtual machine experience or through the Recovery Services vault experience. If you plan to use the virtual
This post was co-authored by Muhammad Raza Khan & Akshay Mehra (both former interns), with Mohamed Abdel-Hady & Debraj GuhaThakurta (Senior Data Scientist Leads) and Wei Guo & Zoran Dzunic (Data Scientists) at Microsoft.
We recently published two real-world scenarios demonstrating how to use Azure Machine Learning alongside the Team Data Science Process (TDSP) to execute AI projects involving Natural Language Processing (NLP) use cases, namely sentiment classification and entity extraction. This blog post provides a summary of these two samples, which are available through public GitHub repositories. The samples use a variety of Azure data platforms, such as Data Science Virtual Machines (DSVMs) to train DNN models for sentiment classification and entity extraction using GPUs, and HDInsight Spark for data processing and word embedding model training at scale. The samples show how domain-specific word embeddings, generated from domain-specific and labeled training data sets, outperform generic word embeddings trained on general, unlabeled data, leading to improved accuracy in classification and entity extraction tasks.
The samples show several capabilities of the Azure ML Workbench including:
Instantiation of Team Data Science Process (TDSP) structure and templates.
Execution of Python scripts on different compute environments.
Run history tracking for Python scripts.
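Comparisons between embeddings typically rest on cosine similarity between word vectors. As a minimal, self-contained sketch of that measure, the vectors below are toy values chosen for illustration, not trained embeddings from the samples:

```python
import math

# Illustrative only: cosine similarity, the standard way to compare word
# embeddings. The three vectors below are toy values, not real embeddings.
def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = (math.sqrt(sum(a * a for a in u))
            * math.sqrt(sum(b * b for b in v)))
    return dot / norm

# In a domain-specific embedding, related domain terms (e.g. a drug name
# and an adverse-event term in biomedical text) tend to sit closer together.
drug      = [0.9, 0.1, 0.2]
adverse   = [0.8, 0.2, 0.3]
unrelated = [0.1, 0.9, 0.1]

print(cosine_similarity(drug, adverse))    # high: related terms
print(cosine_similarity(drug, unrelated))  # low: unrelated terms
```

The intuition behind the samples' result is that training on in-domain text moves related domain terms closer together under exactly this measure.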
Source: https://powerbi.microsoft.com/en-us/blog/year-in-review-contest-winner-2017/ Announcing the winner of the 2017 Power BI "Year in Review" data storytelling contest. With 37 entries and a close race, we're happy to announce David Eldersveld, a data visualization and analytics consultant
We increasingly see that every enterprise is formulating, if not already executing, a cloud-first strategy for its on-premises enterprise data management, to benefit from the inherent elasticity, flexibility, and performance of a cloud data warehouse like Azure SQL Data Warehouse (Azure SQL DW).
The common challenge in moving to Azure SQL DW is the complexity of shifting decades of on-premises data management to the cloud. Over the years, enterprises have built complex, disparate suites of applications, such as point of sale, logistics, analytics, and reporting, that communicate with a central database. Many apps can't simply use any database other than the one they were originally written for.
Microsoft has partnered with Datometry to simplify our customers' journey to the cloud. Re-platforming from Teradata Data Warehouse to Azure SQL DW can be completed in weeks rather than years, at a fraction of the cost of a traditional migration. With Datometry's Adaptive Data Virtualization technology, existing Teradata applications can run instantly and natively on Azure SQL DW without rewriting or redesigning the legacy applications.
In the Case Study discussed in this blog post, a Fortune 100 retailer was looking to move their custom business intelligence application with close to 40 million application queries executed per