Azure Security Center provides several threat prevention mechanisms to help you reduce surface areas susceptible to attack. One of those mechanisms is Just-in-Time (JIT) VM Access. Today we are excited to announce the general availability of Just-in-Time VM Access, which reduces your exposure to network volumetric attacks by enabling you to deny persistent access while providing controlled access to VMs when needed.
When you enable JIT for your VMs, you can create a policy that determines the ports to be protected, how long ports remain open, and the approved IP addresses from which these ports can be accessed. The policy helps you stay in control of what users can do when they request access. Requests are logged in the Azure Activity Log, so you can easily monitor and audit access. Security Center also helps you quickly identify existing virtual machines that have JIT enabled and virtual machines where JIT is recommended.
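As a rough sketch, a JIT policy pairs each protected port with its approved source addresses and a maximum open duration. The snippet below models this as a plain Python dictionary; the field names follow the general shape of the Security Center API but are illustrative, not the exact schema:

```python
# Hypothetical sketch of a JIT VM Access policy. Field names are
# illustrative and may differ from the actual Azure REST API schema.
jit_policy = {
    "virtualMachine": "/subscriptions/.../virtualMachines/myVM",
    "ports": [
        # SSH: open for at most 3 hours, only from the corporate range.
        {"number": 22, "protocol": "TCP",
         "allowedSourceAddressPrefix": "203.0.113.0/24",
         "maxRequestAccessDuration": "PT3H"},
        # RDP: open for at most 1 hour, from a single admin address.
        {"number": 3389, "protocol": "TCP",
         "allowedSourceAddressPrefix": "203.0.113.10",
         "maxRequestAccessDuration": "PT1H"},
    ],
}

def is_port_protected(policy, port):
    """Return True if the policy covers the given port."""
    return any(p["number"] == port for p in policy["ports"])

print(is_port_protected(jit_policy, 22))    # True
print(is_port_protected(jit_policy, 8080))  # False
```

Any port not listed in the policy stays locked down by the network security group; listed ports are opened only for the requested window and source.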
Looking to transform your business by improving your on-premises environments, accelerating your move to the cloud, and gaining transformative insights from your data? Here’s your opportunity to learn from the experts and ask the questions that help your organization move forward.
Join us for one or all of these training sessions for a deep dive into a variety of topics, including products like SQL Server 2017, Azure SQL Database, and Azure Cosmos DB, along with Microsoft innovations in artificial intelligence, advanced analytics, and big data.
SQL Server 2017
It’s all about choice. Now, you have the flexibility of leveraging SQL Server 2017’s industry-leading performance and security wherever you like—whether that’s on Windows, Linux, or Docker containers. We’re hosting two training sessions this month to help you learn more about the many exciting features of SQL Server 2017.
Industry-leading performance and security with SQL Server 2017
In this webinar, learn more about innovative SQL Server 2017 features that enhance your applications, analytics, and business intelligence (BI) workloads, including:
- Automated tuning features such as Adaptive Query Processing and Automatic Plan Correction for faster, more consistent performance.
- Advanced security features such as encryption at rest and in use, dynamic data masking, and row-level
When we announced the preview of our new NCv3 virtual machines back in November, I knew they’d be very popular with our customers. NCv3 brings NVIDIA’s latest GPU – the Tesla V100 – to our best-in-class HPC, machine learning, and AI products, delivering huge value across a variety of industries. One preview customer told us their speech recognition models trained in less than 20 minutes, instead of the 1-2 hours that previous generation GPUs required. Another customer told us about the 40-50% performance boost they saw on their reservoir simulations.
With these fantastic customer success stories, I am ecstatic to announce that the NCv3 virtual machines are now generally available in the US East region. We’ll be adding NCv3 to EU West and US South Central later this month. We’ll add AP Southeast in April and UK South and IN Central in May.
But this isn’t the only GPU announcement I am making today. We’re also expanding our NV series, which enables powerful remote visualization applications, into the US East 2, US Gov Virginia, and Central India regions. And our ND series, designed for AI and machine learning workloads, is expanding into the US South Central, AP
Today, we are announcing the public preview of Traffic Analytics, a cloud-based solution that provides visibility into user and application traffic on your cloud networks.
Traffic Analytics analyzes NSG Flow Logs across Azure regions and equips you with actionable information to optimize workload performance, secure applications and data, audit your organization’s network activity, and stay compliant.
With Traffic Analytics, you now can:
- Gain visibility into network activity across your cloud networks. The solution provides insights on:
  - traffic flows across your networks: between Azure and the Internet, within Azure, and across public cloud regions, VNETs, and subnets.
  - inter-relationships between critical business services and applications.
  - applications and protocols on your network, without the need for sniffers or dedicated flow collector appliances.
- Secure your network by identifying threats such as:
  - flows between your VMs and rogue networks.
  - network ports open to the Internet.
  - applications attempting Internet access.
  - anomalous network traffic behavior (e.g., back-end servers attempting connectivity to servers outside your network).
- Improve the performance of your applications by:
  - capacity planning: eliminate over-provisioning or under-utilization by monitoring utilization trends of VPN gateways and other services.
  - analyzing in-bound and out-bound flows.
  - understanding application access patterns (e.g., Where are
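The raw material behind these insights is the NSG flow log record. As an illustrative sketch (assuming the version-1 flow-tuple layout, a comma-separated string of timestamp, source/destination IP and port, protocol, direction, and decision), a single tuple can be decoded like this:

```python
# Minimal sketch of decoding a version-1 NSG flow-log tuple, the kind
# of record Traffic Analytics aggregates. Assumed layout:
#   unix-ts,srcIP,dstIP,srcPort,dstPort,protocol(T/U),direction(I/O),decision(A/D)
def parse_flow_tuple(tuple_str):
    ts, src_ip, dst_ip, src_port, dst_port, proto, direction, decision = \
        tuple_str.split(",")
    return {
        "timestamp": int(ts),
        "src_ip": src_ip,
        "dst_ip": dst_ip,
        "src_port": int(src_port),
        "dst_port": int(dst_port),
        "protocol": {"T": "TCP", "U": "UDP"}[proto],
        "direction": {"I": "inbound", "O": "outbound"}[direction],
        "decision": {"A": "allowed", "D": "denied"}[decision],
    }

flow = parse_flow_tuple("1542110377,10.0.0.4,13.67.143.118,44931,443,T,O,A")
print(flow["decision"])  # allowed
```

Traffic Analytics does this enrichment and aggregation for you at scale, joining flows with topology data so you can reason about services and subnets rather than raw tuples.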
This post is co-authored by the Microsoft Azure Machine Learning team, in collaboration with Databricks Machine Learning team.
Apache Spark is being increasingly used for deep learning applications for image processing and computer vision at scale. Problems such as image classification or object detection are being solved using deep learning frameworks such as Cognitive Toolkit (CNTK), TensorFlow, BigDL and DeepLearning4J, and integrated into Spark through libraries such as MMLSpark or TensorFlowOnSpark. However, until now, there hasn’t been a common interface for importing images, or representing images in Spark DataFrames. Consequently, the different frameworks cannot easily communicate with each other or with core Spark components such as SparkML pipelines or Deep Learning pipelines. To overcome this problem, the Microsoft Azure Machine Learning Team collaborated with Databricks and the Apache Spark community to make images a first-class citizen in core Spark, based on existing industrial standards.
Importing and Representing Images in Spark DataFrames
An image processing and computer vision pipeline typically consists of the image import, preprocessing, model training and inferencing stages, depicted below.
To accurately represent an image throughout this pipeline, you need certain pieces of data:
The pixel values that represent the image itself.
Image resolution or bit
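To make the shape of this representation concrete, here is an illustrative sketch of a single image row. The field names follow the OpenCV-compatible convention adopted for the Spark image schema, but the row is modeled as a plain Python dict rather than a real Spark Row:

```python
# Illustrative sketch of the single-column image representation:
# one struct per image, carrying metadata plus the raw pixel buffer.
def make_image_row(origin, height, width, n_channels, mode, data):
    # The pixel buffer must match the declared dimensions.
    assert len(data) == height * width * n_channels, "pixel buffer size mismatch"
    return {
        "origin": origin,        # source URI of the image
        "height": height,        # image height in pixels
        "width": width,          # image width in pixels
        "nChannels": n_channels, # e.g. 1 (grayscale) or 3 (BGR)
        "mode": mode,            # OpenCV type code, e.g. 16 for CV_8UC3
        "data": data,            # raw bytes, row-major
    }

# A 2x2 three-channel image: 12 bytes of pixel data.
img = make_image_row("file:/tmp/sample.png", 2, 2, 3, 16, bytes(12))
print(img["nChannels"])  # 3
```

Because every framework reads and writes the same struct, CNTK, TensorFlow, or MMLSpark stages can hand images to each other, or to SparkML pipelines, without bespoke conversion code.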
We are happy to announce that job monitoring and job view have been added into the Azure Data Lake Tools for Visual Studio Code. Now, you can perform real-time monitoring for the jobs you submit. You can also view job summary and job details for historical jobs as well as download any of the input or output data and resources files associated with the job.
Key Customer Benefits

- Monitor job progress in real time within VSCode for both local and ADL jobs.
- Display job summary and data details for historical jobs.
- Resubmit a previously run job.
- Download job inputs, outputs, and resource data files.
- View the U-SQL script for a submitted job.

Summary of key new features
Job View Page: Display job summary and job progress within VSCode.
Data Page: Display job input, output and resources files. Support file download.
Show Historical Jobs: Use command ADL: Show Jobs for both local and ADL historical jobs.
Set Default Context: Use command ADL: Set Default Context to set default context for current working folder.
How to install or update
A post on the Microsoft Imagine blog (Students: Get career-ready with Azure for Students) announced the new Azure for Students offer, which gets verified students started with US$100 in Azure credits to be used within the first 12 months plus select free services (subject to change) without requiring a credit card at sign-up. For all the details, see the Azure for Students FAQ.
Now in preview
Announcing new milestones for Microsoft Cognitive Services vision and search services in Azure – Microsoft Custom Vision service, now available in public preview on the Azure portal, makes it possible for developers to easily train a classifier with their own data, export the models and embed these custom classifiers directly in their applications, and run them offline in real time on iOS, Android, and many other edge devices. In addition, Bing Entity Search, which developers can use to identify the most relevant entity based on searched terms and provide primary details about those entities, is now generally available on the Azure portal.
Now generally available
Confidently plan your cloud migration: Azure Migrate is now generally available! – Azure Migrate is offered at no additional charge and provides appliance-based, agentless discovery of your on-premises environments.
I am excited to share our new Azure Security and Compliance Blueprint for HIPAA/HITRUST – Health Data & AI. Microsoft’s Azure Blueprints are resources to help build and launch cloud-powered applications that comply with stringent regulations and standards. Included in the blueprints are reference architectures, compliance guidance and deployment scripts.
“The best part of the Azure Security & Compliance Blueprint is that it encompasses the exact Azure services architecture required to help customers meet their HIPAA and HITRUST security, privacy, and compliance obligations, along with supporting documentation and a fully-automated deployment process.”
– Tibi Popp, CTO, Archive360
Health organizations all over the world are looking to leverage the power of AI and the cloud to improve outcomes, accelerate performance, and enable the vision of precision medicine. “We are enthusiastic about the potential to foster multi-institutional collaborative environments for data sharing and machine learning,” said Chuck Mayo, PhD, of University of Michigan Medicine. Microsoft is working to meet these challenges with Healthcare NExT, an initiative which aims to accelerate healthcare innovation through artificial intelligence and cloud computing, while at the same time working to protect the privacy and confidentiality of patients.
“We are entrusted with our customer’s
Our goal with Azure Monitoring tools is to provide full-stack monitoring for your applications. The top of this “stack” isn’t the client-side of your app, it’s your users themselves. Understanding user behavior is critical for making the right changes to your apps to drive the metrics your business cares about.
Recent improvements to the usage analytics tools in Application Insights can help your team better understand overall usage, dive deep into the impact of performance on customer experience, and give more visibility into user flows.
A faster, more insightful experience for Users, Sessions, and Events
Understanding application usage is critical to making smart investments with your development team. An application can be fast, reliable, and highly available, but if it doesn’t have many users, it’s not contributing value to your business.
The Users, Sessions, and Events tools in Application Insights make it easy to answer the most basic usage analytics question: “How much do my application and each of its features get used?”
We’ve re-built the Users, Sessions, and Events tools to make them even more responsive. A new sidebar of daily and monthly usage metrics helps you spot growth and retention trends. Clicking on each metric gives you
This post series provides the latest updates and news for Visual Studio Team Services and is a great way for Azure users to keep up-to-date with new features being released every three weeks. Visual Studio Team Services offers the best DevOps tooling to create an efficient continuous integration and release pipeline to Azure. With the rapidly expanding list of features in Team Services, teams can start to leverage it more efficiently for all areas of their Azure workflow, for apps written in any language and deployed to any OS.
Azure Red Shirt Dev Tour: Our VSTS account
Scott Guthrie has been traveling the world on a tour he’s called the Azure Red Shirt Dev Tour. As part of that, he shows the account our team uses to build VSTS. That’s right – we use VSTS to plan, build, test, and release VSTS. See what VSTS looks like for a large team in Scott’s demo of VSTS using our account (mseng.visualstudio.com) – showing ongoing work on VSTS live on stage – from the New York City stop on the tour. If you want to go deep on how our team works, check out DevOps at Microsoft.