The Bio-IT World Conference and Expo kicks off tomorrow in Boston. Microsoft is a platinum sponsor of this event. The research and product development being done in the life sciences and healthcare industries are having a tremendous impact. Many of these efforts require a large volume of computation. Bio-IT World is where thought leaders at the forefront of this work come together to share knowledge and help advance the state of the science.
Representatives from our genomics team will be on hand in booth #446 to showcase Microsoft’s solutions for secure and scalable genomics work. Our Azure Specialized Compute team will demo how Azure’s high performance computing offerings can help researchers get simple, managed access to the compute resources they need, whatever that need may be. In addition, our partners at DNAnexus will be presenting a demo in the Microsoft booth from 3:30 to 4:00pm on Wednesday.
Jason Stowe, Principal Group Program Manager on the Azure Specialized Compute team, will be an instructor for the “Bootstrapping your life science startup” workshop on Tuesday afternoon. On Wednesday morning, he will introduce the plenary keynote, in which a panel of experts will discuss data science issues in the life sciences.
On a weekly basis, more than 5 million people find help using Microsoft online help and documentation. Despite having more than 380,000 pages of articles, references, and tutorials, we each have our own stories of painful gaps.
Today, we’re introducing a new project to help, code-named Aladdin.
What is Aladdin? Aladdin is an AI assistant that connects you to relevant Azure documentation and helps you accomplish your work more efficiently. Just like any assistant, the more you work with Aladdin, the better it’ll be able to work for you. And as Aladdin helps you, it will be helping us identify gaps and areas for improvement in our docs and products.
Each prompt recommends contextually-relevant articles:
Click on a card for more information, and “Go to original article” to navigate to the source page in a new tab:
Ask Aladdin if you have questions – Aladdin surfaces expert responses to questions from real customers:
How does it work?
This is a continuation of our customer success story blog series for Azure Backup. In the previous case study we covered Russell Reynolds; here we will discuss how Somerset County Council in the United Kingdom was able to improve its backup and restore efficiency and reduce its backup costs using Azure Backup.
Somerset County Council provides government services to its 550,000 residents in the United Kingdom. It is one of the oldest local governments in the world, established around A.D. 700. Somerset had been using an in-house storage manager platform for its on-premises data backup and restore.
“The biggest problems we had were with flexibility and scalability. We had racks and racks of disks, and we had to wait a long time to get new hardware. The complexities with the product itself also introduced many challenges,” says Dean Cridland, Senior IT Officer at Somerset County Council. In addition, as the data footprint grew, IT staff struggled to hit their daily backup SLA. So they were looking for a modern backup solution that could keep up with their ever-growing data footprint, meet their backup SLA, and align with their strategy of moving to the cloud.
How Azure Backup helped
Somerset deployed Azure Backup Server to
SQL Operations Studio can be used with Azure SQL Data Warehouse (SQL DW) to create rich, customizable dashboard widgets that surface insights into your data warehouse. This unlocks key scenarios for managing and tuning your data warehouse to ensure it is optimized for consistent performance. Previously, developers had to manually and repeatedly execute complex DMV queries to extract insights from their data warehouse, making it tedious to follow development and tuning best practices with SQL DW. Now, with SQL Operations Studio, customized insight widgets can be embedded directly within the query tool, enabling you to seamlessly monitor and troubleshoot issues with your data warehouse.
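As a sketch of how such a widget is wired up, an insight widget is declared in SQL Operations Studio’s user settings and points at a saved T-SQL file. The widget name, grid sizes, and query file path below are illustrative placeholders; check the insight widget tutorial for the exact schema in your version:

```json
{
  "dashboard.database.widgets": [
    {
      "name": "Data Skew",
      "gridItemConfig": { "sizex": 2, "sizey": 1 },
      "widget": {
        "insights-widget": {
          "type": { "count": {} },
          "queryFile": "C:\\queries\\data-skew.sql"
        }
      }
    }
  ]
}
```

Once saved, the widget appears on the database dashboard and re-runs the query file each time the dashboard is refreshed.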
The following widgets can be generated by using the provided T-SQL monitoring scripts within SQL Operations Studio for common data warehouse insights.
Detect data skew across distributions to help identify and troubleshoot query performance issues:
Assess columnstore health and statistics:
Identify and understand workload patterns through active sessions queries, queued queries, loads, and backups:
Ensure adequate resources are allocated such as memory and TempDB:
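As a hedged illustration of the kind of T-SQL these widgets run, the queries below use documented SQL DW system commands and DMVs; the table name dbo.FactSales is a placeholder for one of your hash-distributed tables:

```sql
-- Rows and space per distribution for a table;
-- a large spread across distributions indicates data skew.
DBCC PDW_SHOWSPACEUSED('dbo.FactSales');

-- Active and queued requests, to understand workload patterns.
SELECT request_id, status, submit_time, start_time, command
FROM sys.dm_pdw_exec_requests
WHERE status NOT IN ('Completed', 'Failed', 'Cancelled')
ORDER BY submit_time;
```

Saving queries like these as .sql files is what lets the insight widgets above surface the results continuously instead of requiring manual re-execution.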
During the last few months, I’ve spoken with a lot of Azure customers, both in person and online, about how to prepare for the May 25, 2018 deadline for compliance with the EU’s General Data Protection Regulation (GDPR). The GDPR imposes new rules on companies, government agencies, non-profits, and other organizations that offer goods and services to people in the European Union (EU), or that collect and analyze data tied to EU residents. The GDPR applies no matter where you are located. The GDPR will dramatically shift the landscape for data collection and analysis, since under the GDPR, many practices that were commonplace will be forbidden, and companies must take care in assessing their exposure and how to comply.
I recently participated in a series of Microsoft webinars about the GDPR and its implications for IT teams and cloud computing. We received a lot of questions from the audience in these webinars, so I thought I would answer some of the most frequently asked ones here, along with links to the on-demand webinars.
Q: Does the GDPR allow me to send data outside the EU?
A: GDPR applies globally, so no matter
Microsoft will have a major presence at Spark + AI Summit 2018 in San Francisco, the premier event for the Apache Spark community. Rohan Kumar, Corporate Vice President of Azure Data, will deliver a keynote on how Azure Databricks combines the best of the Apache® Spark™ analytics platform and Microsoft Azure data services to help customers unleash the power of data and reimagine possibilities that will improve our world.
Azure Databricks, a fast, easy, and collaborative Apache Spark-based analytics platform optimized for Azure, was made generally available in March 2018. To learn more about the announcement, read Rohan Kumar’s blog about how Azure Databricks can help customers accelerate innovation and simplify the process of building Big Data & AI solutions. At Spark + AI Summit, we have a number of sessions showcasing the great work our customers and partners are doing and how Azure Databricks is helping them achieve productivity at scale.
Sign up for training on Spark!
On Monday, June 4, 2018, there are a number of full-day training courses on Apache Spark, ranging from beginner to advanced, that will enhance your skill set and even prepare you for certification on Spark.
Apache Spark essentials
This 1-day course is for
Source: https://powerbi.microsoft.com/en-us/blog/register-now-for-the-microsoft-business-applications-summit-july-22-24-2018-in-seattle/ We’re excited to announce registration is open for the first-ever Microsoft Business Applications Summit, coming to Seattle July 22–24! Combining the best of the Microsoft Data Insights Summit, Convergence, and the Dynamics 365
We are pleased to announce that Azure SQL Data Warehouse (Azure SQL DW) now supports automatic creation of column level statistics. Azure SQL DW is a fast, flexible, and secure analytics platform for the enterprise.
Modern systems such as Azure SQL DW rely on cost-based optimizers to generate quality execution plans for user queries. Even though Azure SQL DW implements a cost-based optimizer, the system has relied on developers and administrators to create statistics objects manually. When all queries are known in advance, determining which statistics objects need to be created is an achievable task. However, when the system faces ad-hoc and unpredictable queries, which is typical of data warehousing workloads, system administrators may struggle to predict which statistics need to be created, leading to potentially suboptimal query execution plans and longer query response times. One way to mitigate this problem is to create statistics objects on all table columns in advance. However, that approach comes with a penalty, as the statistics objects must be maintained during the table loading process, causing longer loading times.
Azure SQL DW now supports automatic creation of statistics objects, providing greater flexibility, productivity, and ease of use for system administrators and developers.
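For reference, the behavior is controlled with standard T-SQL, and manual statistics remain fully supported; the database, table, and column names below are placeholders:

```sql
-- Enable automatic creation of single-column statistics for the database.
ALTER DATABASE MyDataWarehouse SET AUTO_CREATE_STATISTICS ON;

-- Manual creation and maintenance are still available when you want
-- explicit control over specific columns:
CREATE STATISTICS stats_OrderDateKey ON dbo.FactSales (OrderDateKey);
UPDATE STATISTICS dbo.FactSales;
```

With the feature on, the optimizer creates missing single-column statistics when a query first needs them, trading a small one-time cost on that query for better plans afterward.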
We are pleased to announce a series of improvements to Application Insights Profiler and Snapshot Debugger. Profiler identifies the lines of code that slow down web app performance under load. Snapshot Debugger captures runtime exception call stacks and local variables to help identify issues in code. To ensure users can easily and conveniently use these tools, we delivered the following new features for Profiler and Snapshot Debugger:
Application Insights enablement with Profiler and Snapshot Debugger
With the newly enhanced Application Insights enablement experience, Profiler and Snapshot Debugger are turned on by default when you enable Application Insights.
Enabling Snapshot Debugger without redeploying your web app: For ASP.NET Core web apps, Snapshot Debugger is a simple, default option when enabling Application Insights. Previously, you had to modify the project to install a NuGet package and add exception-tracking code. Now it’s done via an ASP.NET Core hosting startup enabled through an App Setting, so no redeployment is required. ASP.NET support will be available very soon.
Enabling Profiler with Application Insights in one step: Enabling Profiler used to require a separate Profiler Configuration pane and extra steps. This is no longer needed.
On-demand profiler: Triggering a
Creating an advanced conversational system is now a simple task with the powerful tools integrated into Microsoft’s Language Understanding Service (LUIS) and Bot Framework. LUIS brings together cutting-edge speech, machine translation, and text analytics on the most enterprise-ready platform for building conversational systems. In addition to these features, LUIS is currently GDPR, HIPAA, and ISO compliant, enabling it to deliver exceptional service across global markets.
Talk or text?
Bots and conversational AI systems are quickly becoming a ubiquitous technology, enabling natural interactions with users. Speech remains one of the most widely used input forms and comes naturally to mind when thinking of conversational systems. This requires integrating speech recognition with language understanding in conversational systems. Individually, speech recognition and language understanding are among the most difficult problems in cognitive computing. Introducing the context of language understanding improves the quality of speech recognition. Through intent-based speech priming, the context of an utterance is interpreted using the language model, improving the performance of both speech recognition and language understanding. Intent-based speech recognition priming uses the utterances and entity tags in your LUIS models to improve accuracy and relevance while converting audio to text. Incorrectly recognized spoken phrases or