We have heard from many customers about their challenges with detecting highly evasive threats. To help provide guidance, we previously published "Windows DNS server logging for network forensics" and introduced the Azure DNS Analytics solution. Today, we are discussing some of our more complex, heuristic techniques for detecting malicious use of this vital protocol, and how they detect key components of common real-world attacks.
These analytics focus on behavior that is common to a variety of attacks, ranging from advanced targeted intrusions to the more mundane worms, botnets, and ransomware. Such techniques are designed to complement more concrete signature-based detection, offering the opportunity to identify this behavior before analyst-driven rules are deployed. This is especially important in the case of targeted attacks, where time to detection is typically measured in months. The longer an attacker has access to a network, the more expensive the eventual clean-up and removal process becomes. Similarly, while rule-based detection of ransomware is normally available within a few days of an outbreak, this is often too late to avoid significant brand and financial damage for many organizations.
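As an illustration (not a technique described in this post), one widely used behavioral heuristic of this kind flags domain names whose character distribution has unusually high entropy, a common trait of the algorithmically generated domains used by botnet command-and-control. A minimal Python sketch, with an illustrative threshold:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy (bits per character) of a string."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_generated(domain: str, threshold: float = 3.5) -> bool:
    """Flag the leftmost label if its entropy exceeds a tuned threshold.

    The threshold and length cutoff are illustrative; real detection
    systems combine many such features (query volume, NXDOMAIN rate,
    label length, query timing) before raising an alert.
    """
    label = domain.split(".")[0]
    return len(label) >= 10 and shannon_entropy(label) > threshold

print(looks_generated("microsoft.com"))         # ordinary name
print(looks_generated("xkqjz8w3vqpl0am2.net"))  # DGA-like label
```

A single feature like this produces false positives on its own (CDN hostnames, for example), which is why behavioral analytics aggregate many weak signals rather than alerting on one.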
These analytics, along with many more, are enabled through Azure Security Center
Staying on top of new technologies is always important, but what skills will help an IT professional become more influential in the workplace? And what role does certification play?
A Microsoft-sponsored IDC white paper, "Cloud Skills and Organizational Influence: How Cloud Skills Are Accelerating the Careers of IT Professionals," found that even though 70 percent of CIOs surveyed identify themselves as having a "cloud-first IT strategy," only 16 percent of companies have the IT skills to carry out this strategy. This translates into demand for people with the expertise to thrive in a cloud-first environment.
As companies continue moving to the cloud, certain skills will be especially relevant:
Business intelligence: Businesses thrive on data, and there is continued demand for IT pros who can make it easier to turn that data into actionable intelligence.

DevOps: Companies value the IT operations specialist who can balance the need for agile design and development with the protection of intellectual property and business goals.

Identity and access management: Security threats continue to evolve, and there is a constant need for individuals who understand the threat landscape and who can develop and deploy solutions to protect assets, while maintaining employee
This blog post is co-authored by Joachim Hafner, Cloud Solution Architect
We’re pleased to offer a free e-book for those of you who are new to Azure or in the beginning stages of planning a cloud migration: the Azure Strategy and Implementation Guide for IT Organizations. As cloud solution architects, we hear a lot of the same questions from customers who are starting to think about their cloud implementation process. This motivated us to write this e-book, which provides guidance in the form of a blueprint that customers can follow to help shape their cloud strategy.
Whether you are starting with Azure or doing more general research regarding how IT teams navigate cloud implementation, this guide offers a balance of broadly applicable advice and Azure specifics for you to consider. Here is an overview of what is covered:
Chapter 1: Governance – This chapter covers the starting points, from the aspirational "digital transformation" to the important tactical steps of administration and resource naming conventions. Get an overview of topics such as envisioning, cloud readiness, administration, and security standards and policy.

Chapter 2: Architecture – This section takes a longer look at security, touches on cloud design patterns, and
Today, we are excited to announce support for backup of large disk VMs and a set of improvements aimed at reducing the time taken for backup and restore. These improvements and large disk support are based on a new VM backup stack, and are available for both managed and unmanaged disks. You can seamlessly upgrade to this new stack without any impact to your ongoing backup jobs, and there is no change to how you set up backup or restore.
This announcement combines multiple feature improvements:
Large disk support – You can now back up VMs with disk sizes up to 4 TB (4,095 GB), both managed and unmanaged.

Instant recovery point – A recovery point is available as soon as the snapshot phase of the backup job completes. This eliminates the need to wait for the data transfer phase to finish before triggering a restore. It is particularly useful in scenarios where you want to apply a patch: you can proceed once the snapshot phase is done, and use the local snapshot to revert if the patch goes bad. This is analogous to the checkpoint solution offered by Hyper-V or VMware with
Source: https://powerbi.microsoft.com/en-us/blog/week-of-3-13-two-great-webinars-data-storytelling-with-your-data-by-tristan-malherbe-and-driving-business-intelligence-with-the-visio-custom-visual-for-power-bi/

This week we have two great webinars for you!

3/13 Webinar: Data Storytelling with Power BI by Tristan Malherbe

3/15 Webinar: Drive collaboration and business intelligence using the Visio custom visual for Power BI
There is a new urgency around reaching oil more efficiently in a capital- and risk-intensive environment, especially with the narrow margins of non-traditional exploration. Offshore drilling for oil can cost several hundred million dollars, with no guarantee of finding oil at all. On top of that, the high cost of data acquisition, drilling, and production reduces average profit margins to less than ten percent. The expense and strict time limits of petroleum licenses also impose a fixed window for exploration, requiring data acquisition, data processing, and interpretation of 3-D images within a limited time-to-solution envelope.
High performance computing (HPC) helps oil and gas companies accelerate ROI and minimize risk by giving the engineers and geoscientists engaged in identifying and analyzing resources the computing power to inform crucial project decisions. Azure provides true HPC in the cloud for customers in the oil and gas industry, with a broad range of compute resources to meet the needs of oil and gas workloads. This ranges from single-node jobs that use our compute-optimized F-series virtual machines to tightly coupled many-node jobs that run on the H-series virtual machines, and all
This post was authored by Mary Wahl, Kolya Malkin, Siyu Yang, Patrick Flickinger, Wee Hyong Tok, Lucas Joppa, and Nebojsa Jojic, representing the Microsoft Research and AI for Earth teams.
Last week Microsoft launched the Geo AI Data Science Virtual Machine (DSVM), an Azure VM type specially tailored to data scientists and analysts who manage geospatial data. To support the Geo AI DSVM launch, we are sharing sample code and methods for our joint land cover mapping project with the Chesapeake Conservancy and ESRI. We used Microsoft’s Cognitive Toolkit (CNTK) to train a deep neural network-based semantic segmentation model that assigns land cover labels to aerial imagery. By reducing cost and speeding up land cover map construction, such models will enable finer-resolution time courses to track processes like deforestation and urbanization. This blog post describes the motivation behind our work and the approach we’ve taken to land cover mapping. If you prefer to get started right away, please head straight to our GitHub repository to find our instructions and materials.
Motivation for the land cover mapping use case
The Chesapeake Conservancy is a non-profit organization charged with monitoring natural resources in the Chesapeake Bay watershed, a >165,000 square kilometer region
Last week, it became even more evident that Azure is the best place for all of your applications and data with numerous announcements about investments that dramatically expand the choice and ROI of moving your SQL Server and open source applications to Azure. But these investments aren’t just about Azure. They deliver deeper platform consistency across on-premises and cloud with a rich open source application framework and database support, and expanded cost-savings for Microsoft customers.
Now in preview
Making Azure the best place for all your applications and data – SQL Server customers can now try the preview for SQL Database Managed Instance (see: Migrate your databases to a fully managed service with Azure SQL Database Managed Instance), the Azure Hybrid Benefit for SQL Server, the Azure Database Migration Service preview for Managed Instance, and the preview for Apache Tomcat® support in Azure App Service. This post also forecasts the general availability of Azure Database for MySQL and PostgreSQL in the coming weeks, making it even easier to bring your open source powered applications to Azure.
Introducing SQL Information Protection for Azure SQL Database and on-premises SQL Server! – SQL Information Protection (SQL IP) introduces a set of advanced services
This blog post is co-authored by Ashish Jhanwar, Data Scientist, Microsoft
Content Moderator, part of Microsoft Cognitive Services, allows businesses to use machine-assisted moderation of text, images, and videos to augment human review.
The text moderation capability now includes a new machine learning-based text classification feature, which uses a trained model to identify potentially abusive, derogatory, or discriminatory language — including slang, abbreviations, and offensive or intentionally misspelled words — for review.

In contrast to the existing text moderation service, which flags profanity terms, the text classification feature helps detect potentially undesired content that may be deemed inappropriate depending on context. In addition, it conveys the likelihood of each category and may recommend a human review of the content.
The text classification feature is in preview and supports the English language.
How to use
Content Moderator consists of a set of REST APIs. The text moderation API accepts an additional request parameter, classify=True. If you set this parameter to true and the auto-detected language of your input text is English, the API outputs the additional classification insights shown in the following sections.
If you specify the language as English for non-English text,
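To make the classification output concrete, here is a hedged Python sketch of parsing a response from the text moderation Screen operation with classify=True. The sample response below is illustrative — the scores are made up, and the helper function names are our own, not part of the API:

```python
import json

# Illustrative response body from the Screen operation when classify=True
# is set; field names follow the response shape described above, but the
# text and scores here are placeholders for demonstration.
SAMPLE_RESPONSE = json.loads("""
{
  "OriginalText": "example input text",
  "Classification": {
    "ReviewRecommended": true,
    "Category1": {"Score": 0.002},
    "Category2": {"Score": 0.121},
    "Category3": {"Score": 0.954}
  }
}
""")

def needs_human_review(response: dict) -> bool:
    """Return True when the classifier recommends routing to a human."""
    classification = response.get("Classification") or {}
    return bool(classification.get("ReviewRecommended"))

def category_scores(response: dict) -> dict:
    """Collect the per-category likelihood scores from the response."""
    classification = response.get("Classification") or {}
    return {name: value["Score"]
            for name, value in classification.items()
            if isinstance(value, dict) and "Score" in value}

print(needs_human_review(SAMPLE_RESPONSE))
print(category_scores(SAMPLE_RESPONSE))
```

In a real integration you would obtain this JSON by POSTing your text to the Content Moderator endpoint for your region with your subscription key, passing classify=True as a query parameter, then apply logic like the above to decide whether to queue the item for human review.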
If you need to get up to speed on how to work best in a hybrid environment, we’ve collected five of our most-viewed webinars from the past year for your binge-watching pleasure. These on-demand sessions cover a variety of topics essential to individuals who find themselves needing to work on a mixed on-premises and cloud strategy:
1. Delivering innovation with the right hybrid cloud solution
If you don’t yet have a plan to take control of your mixed on-premises and cloud infrastructure, consolidating each user’s identity to a single credential can help you easily manage and secure services and create a consistent data platform.
The on-demand webinar, "Delivering innovation with the right hybrid cloud solution," covers how to:
Consolidate identities and create a consistent data infrastructure.

Unify development with help from Azure Stack application patterns.

Select and use the best data platform, no matter your cloud model.

2. Migrating to a hybrid cloud environment: An insider’s walkthrough of 3 key methods
Your company has created a cloud strategy, or maybe you are playing catch-up to an employee-driven move to the cloud. Completing that move in an orderly way is a top priority.