18 Jun

Dive into blockchain for healthcare with the HIMSS blockchain webinar

Excitement around the potential of blockchain in healthcare has reached an all-time high and is still accelerating. There is a hunger building across the industry for real use cases, with real healthcare organizations transacting on blockchain, and for proof points, case studies, and success stories that outline the business value sought, the results achieved, what worked well, and what needs improvement. Only through such hands-on experience can we validate the true potential of blockchain in healthcare, refine its application, remedy the deficiencies identified, build trust, and scale it, both in terms of the networks of healthcare organizations participating in various blockchain initiatives and by applying it to additional healthcare use cases.

The HIMSS Blockchain Work Group is a forum of leaders from across healthcare, including providers, payers, pharmaceuticals, life sciences, and industry experts worldwide, collaborating to advance the application of blockchain in healthcare. I have the honor of participating in this group. At the recent HIMSS18 conference in Las Vegas, I had the privilege of moderating a session, organized by the HIMSS Blockchain Work Group, with a panel of worldwide experts on blockchain in healthcare. If you missed that event, you can now view the video on demand. At this event we


18 Jun

Siphon: Streaming data ingestion with Apache Kafka

Data is at the heart of Microsoft’s cloud services, such as Bing, Office, Skype, and many more. As these services have grown and matured, the need to collect, process, and consume data has grown with them. Data powers decisions, from operational monitoring and management of services to business and technology decisions. Data is also the raw material for intelligent services powered by data mining and machine learning.

Most large-scale data processing at Microsoft has been done using a distributed, scalable, massively parallelized storage and computing system that is conceptually similar to Hadoop. This system supported data processing using a batch paradigm. Over time, the need for large-scale data processing at near real-time latencies emerged to power a new class of ‘fast’ streaming data processing pipelines.

Siphon – an introduction

Siphon was created as a highly available and reliable service to ingest massive amounts of data for processing in near real-time. Apache Kafka is a key technology used in Siphon, as its scalable pub/sub message queue. Siphon handles ingestion of over a trillion events per day across multiple business scenarios at Microsoft. Initially Siphon was engineered to run on Microsoft’s internal data center fabric. Over time, the
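
The excerpt describes the pattern rather than showing it, but the pub/sub ingestion model at the core of Siphon is easy to demonstrate with the console tools that ship with Apache Kafka. Below is a minimal sketch run from PowerShell; the broker address, topic name, and payload are illustrative assumptions, not Siphon specifics:

    # Publish a test event to a Kafka topic using Kafka's bundled console producer.
    # Broker address, topic, and payload are placeholders for illustration.
    $broker = "localhost:9092"
    $topic  = "telemetry-events"

    '{"service":"demo","event":"pageview","ts":"2018-06-18T00:00:00Z"}' |
        & "$env:KAFKA_HOME\bin\windows\kafka-console-producer.bat" --broker-list $broker --topic $topic

    # Any consumer subscribed to the topic receives the event independently (pub/sub); Ctrl+C to stop.
    & "$env:KAFKA_HOME\bin\windows\kafka-console-consumer.bat" --bootstrap-server $broker --topic $topic --from-beginning

A production ingestion service like Siphon would use a native Kafka client and batch events for throughput; the console tools simply make the publish/subscribe flow visible.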


18 Jun

Announcing the general availability of Azure SQL Data Sync

We are delighted to announce the general availability (GA) of Azure SQL Data Sync! Azure SQL Data Sync allows you to synchronize data between Azure SQL Database and any other SQL endpoints, unidirectionally or bidirectionally. It enables hybrid SQL deployments and allows local data access from both Azure and on-premises applications. It also allows you to deploy your data applications globally, with a local copy of the data in each region, and keep the data synchronized across all regions. This can significantly improve application response time and reliability by eliminating the impact of network latency and connection failures.

What’s new in Azure SQL Data Sync

With the GA announcement, Azure SQL Data Sync supports some new capabilities:

- Better configuration experience – a more reliable configuration workflow and a more intuitive user experience.
- More reliable and faster database schema refresh – database schemas now load more efficiently using the new SMO library.
- More secure data synchronization – we reviewed the end-to-end sync workflow and ensured user data is always encrypted at rest and in transit. The Data Sync service now meets GDPR compliance requirements.

Get started today – Try out Azure SQL Data Sync
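
If you prefer to script the setup rather than use the Azure portal, the AzureRM.Sql PowerShell module exposes Data Sync cmdlets. A minimal sketch follows; all resource names are placeholders, and you should check the Data Sync documentation for the exact parameters in your module version:

    # Create a sync group on the hub database; a sync metadata database tracks sync state.
    # All names below (resource group, servers, databases) are placeholders.
    $rg = "myResourceGroup"
    New-AzureRmSqlSyncGroup -ResourceGroupName $rg -ServerName "hub-server" -DatabaseName "HubDb" `
        -Name "demo-sync-group" -ConflictResolutionPolicy "HubWin" -IntervalInSeconds 300 `
        -SyncDatabaseResourceGroupName $rg -SyncDatabaseServerName "hub-server" -SyncDatabaseName "SyncMetadataDb"

    # Add a member database that syncs bidirectionally with the hub.
    $cred = Get-Credential   # SQL credential for the member database
    New-AzureRmSqlSyncMember -ResourceGroupName $rg -ServerName "hub-server" -DatabaseName "HubDb" `
        -SyncGroupName "demo-sync-group" -Name "member1" -MemberDatabaseType "AzureSqlDatabase" `
        -SyncDirection "Bidirectional" -MemberServerName "member-server.database.windows.net" `
        -MemberDatabaseName "MemberDb" -MemberDatabaseCredential $cred

    # Trigger a sync on demand, in addition to the scheduled interval.
    Start-AzureRmSqlSyncGroupSync -ResourceGroupName $rg -ServerName "hub-server" `
        -DatabaseName "HubDb" -SyncGroupName "demo-sync-group"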

If you are building a hybrid platform or global distributed


18 Jun

Top 8 reasons to choose Azure HDInsight

Household names such as Adobe, Jet, ASOS, Schneider Electric, and Milliman are amongst hundreds of enterprises that are powering their Big Data Analytics using Azure HDInsight. Azure HDInsight launched nearly six years ago and has since become the best place to run Apache Hadoop and Spark analytics on Azure.

Here are the top eight reasons why enterprises are choosing Azure HDInsight for their big data applications:

1. Fully managed cluster service for Apache Hadoop and Spark workloads: Spin up Hive, Spark, LLAP, Kafka, HBase, Storm, or R Server clusters within minutes (see the provisioning sketch after this list), deploy and run your applications, and let HDInsight do the rest. We monitor the cluster and all its services, detect and repair common issues, and respond to issues 24/7.

2. Guaranteed high availability (99.9 percent SLA) at large scale: Run your most critical and time-sensitive workloads across thousands of cores and terabytes of memory under an industry-leading availability SLA of 99.9 percent for the whole software stack. Your big data applications run more reliably because HDInsight monitors their health and automatically recovers from failures.

3. Industry-leading end-to-end security and compliance: Protect your most sensitive enterprise data assets using the
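
As a concrete illustration of point 1, provisioning a cluster really is a few lines of PowerShell with the AzureRM.HDInsight module. Here is a minimal sketch for a Linux-based Spark cluster; every name, size, and region below is a placeholder:

    # Provision a 4-node Linux Spark cluster backed by an existing storage account.
    # Resource names, region, and node count are placeholders for illustration.
    $rg = "myResourceGroup"
    $httpCred = Get-Credential -Message "Cluster login (HTTP) credentials"
    $sshCred  = Get-Credential -Message "SSH credentials"
    $storageKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $rg -Name "mystorageacct")[0].Value

    New-AzureRmHDInsightCluster -ResourceGroupName $rg -ClusterName "demo-spark" -Location "East US" `
        -ClusterType Spark -OSType Linux -ClusterSizeInNodes 4 `
        -HttpCredential $httpCred -SshCredential $sshCred `
        -DefaultStorageAccountName "mystorageacct.blob.core.windows.net" `
        -DefaultStorageAccountKey $storageKey -DefaultStorageContainer "demo-spark"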


14 Jun

Container Tooling for Service Fabric in Visual Studio 2017

The latest version of the Service Fabric tools, included in Visual Studio 2017 Update 7 (15.7), adds the new container tooling for Service Fabric feature. This new feature makes debugging and deploying existing applications in containers on Service Fabric easier than ever before.

Containerize .NET Framework and .NET Core applications and run them on Service Fabric

You can now take an existing console or ASP.NET application, package it into a container image, and run and debug it on Service Fabric as a container on your local developer workstation. With a few clicks, you can make your existing .NET application run in a container in a Service Fabric environment: simply right-click your project in Solution Explorer and select Add –> Container Orchestrator Support. This displays a dialog box where you select Service Fabric and click OK.

Doing this creates a Dockerfile in your project, adds the required Service Fabric files, and creates a new Service Fabric application project in the solution. If your project is part of a solution with an existing Service Fabric application, it will be added to that application automatically. This must be done for each project in
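
For context on what the tooling automates, the same deployment can be done by hand with the Service Fabric PowerShell cmdlets. A rough sketch against the local development cluster, with the package path and application names as placeholders:

    # Connect to the local development cluster.
    Connect-ServiceFabricCluster -ConnectionEndpoint "localhost:19000"

    # Upload the application package to the cluster's image store.
    # The package path and image store names below are placeholders.
    Copy-ServiceFabricApplicationPackage -ApplicationPackagePath ".\pkg\Release" `
        -ImageStoreConnectionString "fabric:ImageStore" -ApplicationPackagePathInImageStore "MyApp"

    # Register the application type, then create a running instance of it.
    Register-ServiceFabricApplicationType -ApplicationPathInImageStore "MyApp"
    New-ServiceFabricApplication -ApplicationName "fabric:/MyApp" `
        -ApplicationTypeName "MyAppType" -ApplicationTypeVersion "1.0.0"

Visual Studio performs the equivalent of these steps, including building the container image, when you debug or publish the containerized project.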



14 Jun

Publish your solutions to Azure Government – what, why and how

The Azure Marketplace is the premier destination for all your software needs, optimized to run on Azure. The Azure Government Marketplace includes many of the same solutions for use in Azure Government, an exclusive instance of Microsoft Azure that enables government customers to safely transfer mission-critical workloads to the cloud.

Software companies that want to bring their solutions to Azure Government can learn more by watching this episode of the Azure Government video series: Publish your solutions to Azure Government – What, Why and How (Video).

In this episode, Steve Michelotti, Senior Program Manager, Azure Government, talks with Sarah Weldon, Program Manager, Azure Government, about the Azure Government Marketplace. They cover the “what, why and how” of using this platform to reach new government customers.

You’ll learn about the compliance benefits of Azure Government, and how to move into this space if you’ve already published software in the Azure Marketplace. Steve and Sarah include a demo of the publishing process for the Azure Government Marketplace with additional guidance to help ensure your success.

Also, be sure to subscribe to the Microsoft Azure YouTube Channel to see the latest videos on the Azure Government playlist.

About Azure Government



14 Jun

Quick Recovery Time with SQL Data Warehouse using User-Defined Restore Points

We are excited to announce that SQL Data Warehouse (SQL DW) now supports User-Defined Restore Points! SQL DW is a flexible and secure analytics platform for the enterprise optimized for running complex queries fast across petabytes of data.

Previously, SQL DW supported only automated snapshots, guaranteeing an eight-hour recovery point objective (RPO). While this snapshot policy provided a high level of protection, customers asked for more control over restore points to enable more efficient data warehouse management and quicker recovery in the event of workload interruptions or user errors.

Now, with user-defined restore points, in addition to the automated snapshots, you can initiate snapshots before and after significant operations on your data warehouse. With more granular restore points, you can ensure that each restore point is logically consistent, limiting the impact and reducing the recovery time of restoring the data warehouse should that be needed. User-defined restore points can also be labeled so they are easy to identify afterwards.

You can seamlessly create a restore point with a single statement in PowerShell, making it easy to integrate with your data warehouse management operations. You can have up to 42 restore points at any given time, and as all
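
The single PowerShell statement mentioned above looks roughly like this; the resource group, server, database, and label are placeholders:

    # Create a labeled user-defined restore point before a significant operation.
    # All names below are placeholders for illustration.
    New-AzureRmSqlDatabaseRestorePoint -ResourceGroupName "myResourceGroup" `
        -ServerName "mydwserver" -DatabaseName "mydatawarehouse" `
        -RestorePointLabel "before-nightly-load"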

