02 Oct

World-class PyTorch support on Azure

Today we are excited to strengthen our commitment to supporting PyTorch as a first-class framework on Azure, with new capabilities in our Azure Machine Learning public preview refresh. In addition, our PyTorch support extends deeply across many of our AI Platform services and tooling, which we will highlight below.

During the past two years since PyTorch’s first release in October 2016, we’ve witnessed the rapid and organic adoption of the deep learning framework among academia, industry, and the AI community at large. While PyTorch’s Python-first integration and imperative style have long made the framework a hit among researchers, the latest PyTorch 1.0 release brings the production-level readiness and scalability needed to make it a true end-to-end deep learning platform, from prototyping to production.

Four ways to use PyTorch on Azure

Azure Machine Learning service

Azure Machine Learning (Azure ML) service is a cloud-based service that enables data scientists to carry out end-to-end machine learning workflows, from data preparation and training to model management and deployment. Using the service’s rich Python SDK, you can train, hyperparameter tune, and deploy your PyTorch models with ease from any Python development environment, such as Jupyter notebooks or code editors. With Azure ML’s deep
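As an illustrative sketch (the workspace configuration, compute target, and training script names below are hypothetical), submitting a PyTorch training run with the Azure ML Python SDK can look like this:

```python
# Hypothetical sketch of submitting a PyTorch training run with the Azure ML SDK.
# Assumes an existing workspace config.json, an AmlCompute target named "gpu-cluster",
# and a PyTorch training script train.py under ./scripts.
from azureml.core import Workspace, Experiment
from azureml.train.dnn import PyTorch

ws = Workspace.from_config()
experiment = Experiment(workspace=ws, name="pytorch-demo")

estimator = PyTorch(source_directory="./scripts",
                    entry_script="train.py",
                    compute_target="gpu-cluster",
                    use_gpu=True)

run = experiment.submit(estimator)
run.wait_for_completion(show_output=True)
```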


02 Oct

Cooling down storage costs in the healthcare AI blueprint

Artificial intelligence (AI) and machine learning (ML) are transforming healthcare, from streamlining operations to aiding in clinical diagnosis. Yet healthcare organizations are often challenged to begin an AI/ML journey due to lack of experience or high cost.

The Azure Healthcare AI blueprint installs a HIPAA- and HITRUST-compliant environment in Azure for managing and running healthcare AI experiments. It provides a quick start for your AI/ML efforts and can get technical staff proficient with a reference implementation quickly and at little cost.

Since it is a reference implementation, you must consider the ongoing costs to maintain the blueprint infrastructure in production. One place to look for easy savings is in storage. In this entry, we’ll discuss features of Azure Blob Storage and practices to lower its cost.
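One such feature is blob access tiers (Hot, Cool, and Archive), which charge less per gigabyte for data that is accessed infrequently. As a rough sketch (the connection string, container, and blob names below are placeholders), a blob can be moved to the Cool tier with the azure-storage-blob Python package:

```python
# Hypothetical sketch: move an infrequently accessed blob to the Cool access tier.
# The connection string, container, and blob names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="experiments", blob="images/scan-0001.dcm")

# Cool and Archive tiers trade cheaper per-GB storage for higher access costs,
# so they suit data that is retained but rarely read.
blob.set_standard_blob_tier("Cool")
```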

The case for more storage: AI and cognitive services

The blueprint is designed to ease the learning and implementation of AI/ML in a healthcare organization. A “patient length of stay” experiment is included, which uses .csv files that take up little room. But consider other data that could be used for machine learning, such as X-ray, MRI, and other radiological images. And as AI services


02 Oct

Azure Marketplace new offers – Volume 20

We continue to expand the Azure Marketplace ecosystem. From August 16 to 31, 87 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

128T Networking Platform: The 128T Networking Platform enables organizations to rapidly connect to cloud infrastructure, bringing centralized visibility and management of policies across public and private cloud environments.

Ant Media Server Community Edition 1.4.1: Publish live streams with WebRTC or Real-Time Messaging Protocol (RTMP). Your live or video-on-demand streams can play anywhere, including on mobile browsers.

Ant Media Server Enterprise Edition: Ant Media Server Enterprise Edition supports low-latency WebRTC, Real-Time Messaging Protocol (RTMP), MP4, HLS (HTTP Live Streaming), adaptive bitrate, thumbnail generation, and more.

App360 Cloud Management Platform: After enterprises have virtualized their IT infrastructure environments, the next step is to enable centralized management and interoperability between public clouds and on-premises resources. App360 offers hybrid and multi-cloud management.

Application Load Balancer / ADC: Our application delivery controller (ADC) load balancer allows you to easily achieve security, traffic management, SSO/pre-authentication, and load balancing.


01 Oct

Ansible 2.7: What’s new for Azure

Ansible 2.7 will be released on 4 October 2018. Here I would like to share what is coming for Azure in Ansible 2.7. In total, 21 new Azure modules were added, so in 2.7 you now have the ability to natively automate the deployment and configuration of the Azure resources below.

Azure Web Apps: Create and configure your Azure Web Apps hosting web applications, REST APIs, and mobile backends using Ansible.

azure_rm_appserviceplan azure_rm_appserviceplan_facts azure_rm_webapp azure_rm_webapp_facts

Azure Traffic Manager: Create and configure Azure Traffic Manager to distribute traffic optimally to services across global Azure regions using Ansible.

azure_rm_trafficmanagerendpoint azure_rm_trafficmanagerendpoint_facts azure_rm_trafficmanagerprofile azure_rm_trafficmanagerprofile_facts

Azure Database: Create and configure an Azure Database for SQL/MySQL/PostgreSQL server using Ansible.

azure_rm_mysqldatabase_facts azure_rm_mysqlserver_facts azure_rm_postgresqldatabase_facts azure_rm_postgresqlserver_facts azure_rm_sqlfirewallrule

Azure Route: Create and configure your own routes to override Azure’s default routing using Ansible.

azure_rm_routetable azure_rm_routetable_facts azure_rm_route

Azure Application Gateway: Create and configure an Azure Application Gateway to manage web traffic using Ansible.

azure_rm_appgateway

Azure Autoscale: Create and configure Azure autoscale to help applications perform their best when demand changes using Ansible.

azure_rm_autoscale azure_rm_autoscale_facts

Additional facts modules for VM and ACR: Get information about virtual machines or Azure Container Registry for further configuration using Ansible.

azure_rm_virtualmachine_facts azure_rm_containerregistry_facts

Automate Azure Web Apps using
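As a rough illustration of how these new modules fit together (the resource group, names, location, and SKU below are hypothetical), a minimal playbook that creates an App Service plan and a web app could look like this:

```yaml
# Hypothetical sketch: create an App Service plan and a web app with the new 2.7 modules.
- hosts: localhost
  connection: local
  tasks:
    - name: Create an App Service plan
      azure_rm_appserviceplan:
        resource_group: myResourceGroup
        name: myAppServicePlan
        location: eastus
        sku: S1

    - name: Create a web app on that plan
      azure_rm_webapp:
        resource_group: myResourceGroup
        name: myUniqueWebApp
        plan: myAppServicePlan
```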


01 Oct

Time synchronization for financial services in Azure

You know the expression: “Time is money.” For many workloads in the capital markets space, time accuracy is money. Depending on the application and applicable regulation, financial transactions need to be traceable down to the second, millisecond, or even microsecond. Financial institutions themselves are under scrutiny to prove the validity and traceability of these transactions. At Microsoft, we want to ensure our customers are aware of time accuracy and synchronization best practices, and how you can mitigate the risk of negative impact due to time synchronization issues on Azure.

Time accuracy for a computer clock generally refers to how close the computer clock is to Coordinated Universal Time (UTC), the current time standard. In turn, UTC is based on International Atomic Time (TAI), which combines the output of some 400 atomic clocks worldwide, and is kept to within approximately 1 second of mean solar time at 0° longitude. While the most precise time accuracy can be achieved by reading the time from these reference clocks themselves, it is impractical to have a GPS receiver attached to every machine in a datacenter. Instead, a network of time servers, downstream from these systems of record, is used to achieve
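As a small, hedged illustration (the time server name is just an example), the offset of a machine's clock from a given NTP server can be estimated with the third-party ntplib package:

```python
# Hypothetical sketch: estimate the local clock's offset from an NTP time server.
# Requires the third-party ntplib package (pip install ntplib).
import ntplib

client = ntplib.NTPClient()
response = client.request("time.windows.com", version=3)

# offset is the estimated difference, in seconds, between the local clock and the server.
print(f"Clock offset: {response.offset * 1000:.3f} ms")
```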


01 Oct

Gaining insights from industrial IoT data with Microsoft Azure

Technology is moving at an amazing pace, and manufacturers around the world are observing this first-hand. Additive manufacturing, robotics, and IoT are some of the technologies that directly influence the way manufacturing businesses operate. The manufacturing industry is not isolated from the huge leaps in computer technology. Software packages used for managing complex processes and fabrication machine tools (for example, computer numerical control (CNC) milling and turning) are everywhere and are generating huge amounts of data.

I have heard from many customers in the manufacturing industry that they are not sure what steps they need to take to gain insights from all the data they have.

My recommendation is to start small. Do not make big investments right at the start; first discover what can be gained from the available choices, then bring that data into a platform that can provide many other possibilities, such as machine learning and AI. The Azure platform features many application services. Just as important to manufacturers, Azure gives access to hardware resources such as faster CPUs, field-programmable gate arrays (FPGAs), and graphics processing units (GPUs), all easily accessible for state-of-the-art solutions.

What is starting small? What is possible? While seeking answers to


01 Oct

Protecting banks through cloud-based sanction screening solutions

At a previous position, I owned the software and hardware testing across a 6,000-branch network for a large Fortune 100 bank in the U.S. The complexity and sophistication of the end-to-end delivery of products and services to existing customers was daunting. Vetting potential new customers while simultaneously building a new system was tough, especially since the new system had to balance a pleasant front-end user experience with the backend processes required for strong Know Your Customer (KYC) scrubbing (in other words, due diligence). That backend system was invisible, batch-based, and only had post-transaction look-back capability. I learned that banks can have their cake and eat it too, but the business implications of limiting user friction are not trivial, and properly vetting customers puts a lot of pressure on the technology capabilities.

As a complement to my online and mobile fraud theme this quarter (see Detecting Online and Mobile Fraud with AI), I’ll provide some insights on how banks are seeking to rationalize and simplify security and compliance processes in real time. The path stretches from the device to the network and back-end infrastructure. The goal is to ease the burden on employees and reduce costs from fines. Specifically, the requirements


01 Oct

Healthcare costs are skyrocketing! Reduce costs and optimize with AI

In 2016, healthcare costs in the US were estimated at nearly 18 percent of GDP! Cost reduction is a high priority for all types of healthcare organizations, but especially for healthcare providers such as hospitals, clinics, and many others. Optimizing operations within healthcare providers offers substantial benefits in both efficiency and cost reduction. In this use case, healthcare providers seek to optimize resource and asset allocation over time. This can include the allocation of medical devices, staffing of healthcare professionals, and other key aspects of operations. Efficient and affordable healthcare can significantly improve the experience of patients and healthcare professionals, and even improve the quality of care and patient outcomes, especially in emergency situations where a lack of healthcare resources or poor asset allocation could impact patient care. For healthcare professionals under time and cost reduction pressure, and increasingly at risk of clinician burnout, efficient operations and an improved experience can be a significant morale booster.

Below I highlight three strategies to optimize healthcare using artificial intelligence.

Optimize your healthcare operations using AI

Artificial intelligence (AI) and machine learning (ML) have great potential to help healthcare providers identify opportunities to optimize their operations and realize cost savings. For example, AI could


01 Oct

HDInsight Enterprise Security Package now generally available

Enterprise Security Package GA for HDInsight 3.6

The HDInsight team is excited to announce the general availability of the Enterprise Security Package (ESP) for Apache Spark, Apache Hadoop, and Interactive Query clusters in HDInsight 3.6. When enterprise customers share clusters among multiple employees, Hadoop admins must ensure those employees have the right access rights and permissions to perform big data operations. Setting up multi-user access with granular authorization using existing enterprise identities is a complex and lengthy process. Enabling ESP with the new experience provides authentication and authorization for these clusters in a more streamlined and secure manner.

For authentication, open source Apache Hadoop relies on Kerberos. Customers can enable Azure AD Domain Services (AAD-DS) as the main domain controller and use it to domain-join the clusters. The same identities available in AAD-DS will then be able to log in to the cluster.

For authorization, customers can set Apache Ranger policies to enforce fine-grained access control in their clusters. Apache Hive and YARN Ranger plugins are available for setting these policies.
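As a hedged sketch (the Ranger endpoint, service name, credentials, and user identity below are placeholders), a Hive policy granting a single user read access to one database can also be created through Ranger's public REST API, as an alternative to the Ranger admin UI:

```python
# Hypothetical sketch: create a Hive Ranger policy through Ranger's public REST API.
# The Ranger URL, credentials, service name, and user identity are placeholders.
import requests

policy = {
    "service": "myclusterhive",                 # name of the Hive service registered in Ranger
    "name": "sales-read-only",
    "resources": {
        "database": {"values": ["sales"]},
        "table": {"values": ["*"]},
        "column": {"values": ["*"]},
    },
    "policyItems": [
        {
            "users": ["analyst1@contoso.com"],  # an AAD-DS identity
            "accesses": [{"type": "select", "isAllowed": True}],
        }
    ],
}

resp = requests.post(
    "https://<ranger-admin-host>/service/public/v2/api/policy",
    json=policy,
    auth=("<ranger-admin-user>", "<ranger-admin-password>"),
)
resp.raise_for_status()
```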

To learn more about ESP and how to enable it, see our documentation.

Public preview of ESP for Apache Kafka and HBase

We are also expanding


01 Oct

Advanced Threat Protection for Azure Storage now in public preview

We are excited to announce that this week we have made Advanced Threat Protection available for public preview on Azure Storage Blob service. Advanced Threat Protection for Azure Storage detects anomalous activities indicating unusual and potentially harmful attempts to access or exploit storage accounts.

The introduction of this feature helps customers detect and respond to potential threats on their storage account as they occur. For a full investigation experience, it is recommended to configure diagnostic logs for read, write, and delete requests for the blob services.

The benefits of Advanced Threat Protection for Azure Storage include:

Detection of anomalous access and data exfiltration activities.
Email alerts with actionable investigation and remediation steps.
Centralized views of alerts for the entire Azure tenant using Azure Security Center.
Easy enablement from the Azure portal.

How to set up Advanced Threat Protection

1. Launch the Azure portal.
2. Navigate to the configuration page of the Azure Storage account you want to protect.
3. In the Settings page, select Advanced Threat Protection.
4. In the Advanced Threat Protection configuration blade, turn on Advanced Threat Protection.
5. Click Save to save the new or updated Advanced Threat Protection policy.
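As an alternative to the portal steps above, here is a rough sketch (the subscription, resource group, account name, and access token are placeholders) of enabling the same setting through the Azure Resource Manager REST API:

```python
# Hypothetical sketch: enable Advanced Threat Protection on a storage account
# through the Azure Resource Manager REST API. All identifiers and the bearer
# token are placeholders.
import requests

resource_id = (
    "/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<account-name>"
)
url = (
    "https://management.azure.com" + resource_id +
    "/providers/Microsoft.Security/advancedThreatProtectionSettings/current"
    "?api-version=2017-08-01-preview"
)

resp = requests.put(
    url,
    headers={"Authorization": "Bearer <azure-ad-access-token>"},
    json={"properties": {"isEnabled": True}},
)
resp.raise_for_status()
```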

Get started today

We encourage you to try out Advanced Threat
