This post is authored by Mathew Salvaris and Fidan Boylu Uz, Senior Data Scientists at Microsoft.
One of the major challenges that data scientists often face is closing the gap between training a deep learning model and deploying it at production scale. Training these models is a resource-intensive task that demands significant computational power and is typically done on GPUs. The resource requirement is less of a problem for deployment, since inference tends not to pose as heavy a computational burden as training. However, for inference, other goals become pertinent, such as maximizing throughput and minimizing latency. When inference speed is a bottleneck, GPUs show considerable performance gains over CPUs. Coupled with containerized applications and container orchestrators like Kubernetes, it is now possible to go from training to deployment with GPUs faster and more easily while satisfying latency and throughput goals for production-grade deployments.
In this tutorial, we provide step-by-step instructions to go from loading a pre-trained Convolutional Neural Network model to creating a containerized web application that is hosted on a Kubernetes cluster with GPUs on Azure Container Service (AKS). AKS makes it quick and easy to deploy and manage containerized applications without much
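As a rough illustration of the serving side, here is a minimal sketch of the kind of scoring web service you might containerize and deploy to AKS. The model call is a stub standing in for a real CNN forward pass, and all names here are illustrative, not the tutorial's actual code; it uses only the Python standard library.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(image_bytes):
    """Stand-in for a real CNN forward pass. A real service would load a
    pre-trained model once at startup and run inference (ideally on a GPU)."""
    return {"label": "example", "score": 0.99}

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body (raw image bytes) and score it.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        result = predict(body)
        payload = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

def serve(port=8080):
    # In the tutorial's setting, this script would be baked into a container
    # image, pushed to a registry, and deployed to an AKS cluster whose node
    # pool uses GPU-enabled VM sizes.
    HTTPServer(("0.0.0.0", port), ScoreHandler).serve_forever()
```

A container orchestrator like Kubernetes then handles replication of this service, so throughput can be scaled by adding pods while each pod keeps its own copy of the model in GPU memory.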
Many customers are using Azure Service Fabric to build and operate always-on, highly scalable, microservice applications. Recently, we open sourced Service Fabric with the MIT license to increase opportunities for customers to participate in the development and direction of the product. Today, we are excited to announce the release of Service Fabric runtime v6.2 and corresponding SDK and tooling updates.
This release includes:
- The general availability of Java and .NET Core Reliable Services and Actors on Linux
- Public preview of Red Hat Enterprise clusters
- Enhanced container support
- Improved monitoring and backup/restore capabilities
The updates will be available in all regions over the next few days and details can be found in the release notes.
Reliable Services and Reliable Actors on Linux are generally available
Reliable Services and Reliable Actors are programming models to help developers build stateless and stateful microservices for new applications and for adding new microservices to existing applications. Now you can use your preferred language to build Reliable Services and Actors with the Service Fabric API using .NET Core 2.0 and Java 8 SDKs on Linux.
Red Hat Enterprise clusters in public preview
Azure Service Fabric clusters
We are happy to announce Azure IaaS VM backup support for network-restricted storage accounts. With storage firewalls and virtual networks, you can allow traffic only from selected virtual networks and subnets, creating a secure network boundary for the unmanaged disks in your storage accounts. You can also grant access to on-premises networks and other trusted internet traffic by using network rules based on IP address ranges. With this announcement, users can perform and continue scheduled and ad-hoc IaaS VM backups and restores for these VNET-configured storage accounts.
After you configure firewall and virtual network settings for your storage account, select "Allow trusted Microsoft services to access this storage account" as an exception so that the Azure Backup service can access the network-restricted storage account.
This network-focused feature gives customers a seamless experience while enforcing network access-based security: only requests coming from approved Azure VNETs or specified public IP ranges are allowed to reach a given storage account, making it more secure and helping fulfill an organization's compliance requirements.
My high school physics teacher taught us about metal fatigue by having everyone bend paper clips back and forth until they broke. In the real world, engineers use computer simulations to test their designs. From the trivial paperclip to the life-saving crash analysis, computer-aided engineering (CAE) improves products around us every day. But accessing the massive power needed for these simulations can be tough for small organizations.
That’s where our partners at Altair have stepped in. Altair is democratizing access to CAE by building their Software-as-a-Service (SaaS) offerings on Microsoft Azure. In a case study we recently published, Altair describes how their HyperWorks Unlimited Virtual Appliance gives customers the combination of software and scale they need to quickly run their CAE workloads.
But that’s not the end of the story. Altair recently brought their Inspire software to a SaaS model as well. Inspire Unlimited provides a visual cloud collaboration platform for engineering. Inspire Unlimited attains the required scalability by onboarding multiple users on a virtual machine. Using Azure’s NV-series virtual machines, which feature NVIDIA Tesla M60 GPUs, Altair’s customers can get powerful virtual workstations without having to purchase expensive hardware. This allows users to collaborate with only a web browser,
We are pleased to announce that Spring Data Azure Cosmos DB is now available to provide essential Spring Data support for Azure Cosmos DB using SQL API. Azure Cosmos DB is Microsoft’s globally distributed, multi-model database service with exceptional scalability and performance.
With Spring Data Azure Cosmos DB, Java developers can now quickly get started building NoSQL data access for their apps on Azure. It offers a Spring-based programming model for data access while preserving the special traits of the underlying Azure Cosmos DB data store. Features of Spring Data Azure Cosmos DB include a POJO-centric model for interacting with an Azure Cosmos DB collection and an extensible, repository-style data access layer.
Download the Spring Data Azure Cosmos DB sample project to get started. The sample illustrates how to use annotations to interact with a collection, customize a query operation with specific fields, and expose a discoverable REST API for clients.
Create a new database instance
To get started, first create a new database instance by using the Azure portal. You can find Azure Cosmos DB in Databases and choose SQL (Document DB) for the API. When your database has been created, you
This blog post was co-authored by JR Mayberry, Principal PM Manager, Azure Networking.
Today we are excited to announce the general availability of the Azure DDoS Protection Standard service in all public cloud regions. This service is integrated with Azure Virtual Networks (VNet) and provides protection and defense for Azure resources against the impacts of DDoS attacks.
Distributed Denial of Service (DDoS) attacks are intended to disrupt a service by exhausting its resources (e.g., bandwidth, memory). DDoS attacks are one of the top availability and security concerns voiced by customers moving their applications to the cloud. Extortion and hacktivism are common motivations behind DDoS attacks, which have been steadily increasing in type, scale, and frequency of occurrence because they are relatively easy and cheap to launch.
These concerns are justified: the number of documented DDoS amplification attacks increased by more than 357 percent in the fourth quarter of 2017 compared to the same period in 2016, according to data from Nexusguard. Further, more than 56 percent of all attacks exploit multiple vector combinations. In February 2018, GitHub was attacked via a reflection exploit in Memcached, generating 1.35 terabits per second of attack traffic, the largest DDoS attack ever recorded.
As the types and
Today, we are excited to announce the general availability of Transparent Data Encryption (TDE) with Bring Your Own Key (BYOK) support for Azure SQL Database and Azure SQL Data Warehouse. This is one of the most frequently requested features by enterprise customers looking to protect sensitive data and meet regulatory or compliance obligations that require implementation of specific key management controls. TDE with BYOK support is offered in addition to TDE with service managed keys, which is enabled by default on all new Azure SQL Databases.
TDE with BYOK support uses Azure Key Vault, which provides highly available and scalable secure storage for RSA cryptographic keys backed by FIPS 140-2 Level 2 validated Hardware Security Modules (HSMs). Key Vault streamlines the key management process and enables customers to maintain full control of their encryption keys, including managing and auditing key access.
Customers can generate and import their RSA key to Azure Key Vault and use it with Azure SQL Database and Azure SQL Data Warehouse TDE with BYOK support. Azure SQL Database handles the encryption and decryption of data stored in databases, log files, and backups in a fully transparent fashion by using a symmetric Database Encryption Key
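To make the key hierarchy concrete, the toy sketch below shows the wrap/unwrap flow: a customer-managed key-encryption key (KEK) protects the service-generated Database Encryption Key (DEK), and the DEK encrypts the data. XOR stands in for the real ciphers (the service uses proper symmetric encryption for data and an RSA key in Key Vault for wrapping); it is not secure and only illustrates the structure.

```python
import os

def xor_bytes(data, key):
    # Toy stand-in for a real cipher; XOR is NOT secure and is used here
    # only to illustrate the two-tier key hierarchy.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Customer-managed key (in the real service: an RSA key held in Azure Key Vault).
kek = os.urandom(32)

# Symmetric Database Encryption Key (DEK), generated by the service.
dek = os.urandom(32)

# The DEK is wrapped (encrypted) by the KEK; only the wrapped form is stored
# with the database, so the data cannot be read without Key Vault access.
wrapped_dek = xor_bytes(dek, kek)

# Data pages, log files, and backups are encrypted with the DEK,
# transparently to the application.
plaintext = b"sensitive row data"
ciphertext = xor_bytes(plaintext, dek)

# To decrypt, the service first unwraps the DEK using the KEK from Key Vault...
recovered_dek = xor_bytes(wrapped_dek, kek)
# ...then decrypts the data with the recovered DEK.
assert xor_bytes(ciphertext, recovered_dek) == plaintext
```

One consequence of this design is that revoking the service's access to the KEK in Key Vault effectively makes the database unreadable, which is why BYOK gives customers a hard control point over their data.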
In the past, Azure customers on an Enterprise Agreement (EA) have had subscriptions that are centrally controlled by the company's cloud operations or IT team. When a team or employee in the company wants to start using Azure, they need access to the EA enrollment so that usage gets billed to the company's EA. To do that, the employee or team makes a request to the central cloud operations team, goes through approval, and has an Azure subscription provisioned as prescribed by the company's cloud governance policies. During this process, an EA subscription must be manually created using the Azure Account Center. As these companies' Azure adoption increases, this manual subscription-creation step becomes a scalability bottleneck in their cloud management.
To unblock these customers, we've created an API and a suite of SDKs for Azure EA subscription creation.
Get started with Azure EA subscription creation API
To get started, see documentation at Programmatically create Azure enterprise subscriptions (preview) and our sample code. In this release, you can
- Create an Azure EA subscription (regular or dev/test) as an Account Owner.
- Use Azure RBAC to allow another user or service principal to create subscriptions on behalf of an Account
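As a sketch of what the API call looks like, the helper below builds the REST request for the preview createSubscription operation. The URL path, api-version, and offer codes reflect the 2018-03-01-preview documentation and should be verified against the current reference before use; the function and parameter names are illustrative.

```python
# Hypothetical helper for the EA subscription creation REST API (preview).
MANAGEMENT_ENDPOINT = "https://management.azure.com"
API_VERSION = "2018-03-01-preview"

def build_create_subscription_request(enrollment_account, display_name, dev_test=False):
    """Return the URL and JSON body for a createSubscription POST.

    The caller sends this with a bearer token for a principal that has been
    granted rights on the enrollment account.
    """
    url = (
        f"{MANAGEMENT_ENDPOINT}/providers/Microsoft.Billing/enrollmentAccounts/"
        f"{enrollment_account}/providers/Microsoft.Subscription/createSubscription"
        f"?api-version={API_VERSION}"
    )
    body = {
        "displayName": display_name,
        # Offer codes per the preview docs: MS-AZR-0017P = EA,
        # MS-AZR-0148P = EA Dev/Test.
        "offerType": "MS-AZR-0148P" if dev_test else "MS-AZR-0017P",
    }
    return url, body
```

A provisioning pipeline could call this for each approved request, removing the manual Azure Account Center step described above.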
We continue to expand the Azure Marketplace ecosystem. In March 2018, 55 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Kentico on Windows Server 2012 R2: Kentico CMS is a free edition web content management system for building websites, online stores, intranets, and community sites. Create, manage, and integrate communities socially to encourage conversations about your brand.
OpenText Process Suite 16.3 Marketplace Info VM: With intelligently automated, content-rich processes that you can quickly build and easily modify, Process Suite gives you the power to deliver a variety of new digital experiences with a much lower IT workload.
Content Suite 16 (January 2018): OpenText Content Suite Platform is a comprehensive enterprise content management (ECM) system designed to manage the flow of information from capture through archiving and disposition.
BigDL Spark Deep Learning Framework VirtualMachine: Deep Learning framework for distributed computing designed for Apache Spark architecture and highly optimized for Intel Xeon CPUs. Feature-parity with TensorFlow, Caffe, etc., without the need for GPUs.
Gallery Server on Windows Server 2012 R2: Gallery Server is a free, open source, easy-to-use Digital
At Microsoft, our approach is to listen to customers and bring solutions and tools that can help solve their problems. It is at the heart of everything we do. That same listening process led us to PostgreSQL: a couple of years back, we embarked on a journey to bring PostgreSQL to Azure as a fully managed database service. We recently reached a key milestone in that journey when we announced the general availability of Azure Database for PostgreSQL.
Attending community and customer events is always special for me – it’s an opportunity to engage with and learn from some of the leading minds in the industry. PostgresConf US 2018 is even more special given how much support we have received from the Postgres community, and I look forward to meeting community leaders, customers, and partners at the event. I’ll also be joined by a few of my colleagues, and while you can find us at the Microsoft booth, you can also attend a product deep-dive session by Sunil Kamath and a GDPR session by Mark Bolz.
During my keynote at the conference, I’ll share some of our learnings leading to the general availability of Azure Database