07 Jun

Process more files than ever and use Parquet with Azure Data Lake Analytics

Azure Data Lake Analytics (ADLA) is a serverless PaaS service in Azure to prepare and transform large amounts of data stored in Azure Data Lake Store or Azure Blob Storage at unparalleled scale.

ADLA now offers new capabilities for processing files of any format, including Parquet, at tremendous scale.

Previously: Handling tens of thousands of files is painful!

Many of our customers tell us that handling a large number of files is challenging, if not downright painful, in all the big data systems they have tried. Figure 1 shows the distribution of files in common data lake systems. Most files are less than one GB, although a few may be huge.

Figure 1: The pain of many small files

ADLA was developed from a system that was originally designed to operate on very large files whose internal structure helps with scale-out, but it only operated on a couple of hundred to about 3,000 files. It also over-allocated resources when processing small files by assigning one extract vertex per file (a vertex is a compute container that executes a specific part of the script on a partition of the data and


07 Jun

8 reasons to choose Azure Stream Analytics for real-time data processing

Processing big data in real-time is now an operational necessity for many businesses. Azure Stream Analytics (ASA) is Microsoft’s serverless real-time analytics offering for complex event processing. It enables customers to unlock valuable insights and gain competitive advantage by harnessing the power of big data. Here are eight reasons why you should choose ASA for real-time analytics.

1. Fully integrated with the Azure ecosystem: Build powerful pipelines with a few clicks

Whether you have millions of IoT devices streaming data to Azure IoT Hub or apps sending critical telemetry events to Azure Event Hubs, it only takes a few clicks to connect multiple sources and sinks and create an end-to-end pipeline. Azure Stream Analytics provides best-in-class integration with output stores such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Store. It also enables you to trigger custom workflows downstream with Azure Functions, Azure Service Bus Queues, and Azure Service Bus Topics, or to create real-time dashboards using Power BI.
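As a small, hedged sketch of the app-to-Event-Hubs side of such a pipeline (an illustration using the azure-eventhub Python package, not something from the post; the connection string and hub name are placeholders), an app can push telemetry that a Stream Analytics job then reads as an input:

import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder values: substitute your own Event Hubs namespace details.
CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
EVENTHUB_NAME = "telemetry"

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

# Send a small batch of telemetry events; a Stream Analytics job with this
# hub configured as an input can then query them with its SQL-like language.
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"deviceId": "sensor-01", "temperature": 21.7})))
    batch.add(EventData(json.dumps({"deviceId": "sensor-02", "temperature": 22.3})))
    producer.send_batch(batch)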

2. Developer productivity

One of the biggest advantages of Stream Analytics is the simple SQL-based query language with its powerful temporal constraints to analyze data in motion. Familiarity with the SQL language is sufficient to author powerful queries. Azure Stream Analytics also supports language extensibility


07 Jun

Using STMicroelectronics starter kits to connect to Azure IoT in minutes

Microsoft partners with silicon vendors such as STMicroelectronics to simplify and accelerate the development of embedded systems, so our customers can move projects from proof of concept to production faster. One of the most common issues in IoT project development is the transition from proof of concept to production: from a handful of devices to deploying and managing devices at IoT scale, and from development hardware to mass-produced silicon.

STMicroelectronics offers a wide range of IoT hardware along with pre-integrated software, a powerful development ecosystem, and valuable starter kits. With these, connecting to Azure IoT Hub, the cloud platform used to monitor and manage billions of IoT assets, using one of the Microsoft Azure Certified ST devices takes minutes, and you don’t have to write any code! This magic is possible because of the integration ST provides under the covers. For example, the STM32 IoT Discovery Kit Node is a developer kit based on an Arm® Cortex®-M4 core, sporting a full set of low-power wireless connectivity options and environmental, motion, and ranging sensors. FP-CLD-AZURE1 is an STM32Cube function pack that ST developed for this kit. The Azure IoT C SDK is integrated into the middleware of this function pack, which enables direct and
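To give a feel for the device-to-cloud path the function pack wires up, here is a minimal sketch using the azure-iot-device Python SDK instead of the embedded Azure IoT C SDK; it is an illustration only, not ST’s firmware, and the connection string is a placeholder:

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder device connection string from your IoT Hub device registry.
CONNECTION_STR = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=..."

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STR)
client.connect()

# Send one telemetry message over the same device-to-cloud channel the
# STM32Cube function pack exercises through the Azure IoT C SDK.
client.send_message(Message('{"temperature": 23.5, "humidity": 41.0}'))
client.shutdown()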


06 Jun

Drill Through and Report Server configuration with MDM/EMM tools

Source: https://powerbi.microsoft.com/en-us/blog/drill-through-and-report-server-configuration-with-mdm-tools/

Recently, the Power BI Mobile & Devices team released some cool new features. These features address some of the most requested items we got from you – we hear you!


06 Jun

Power BI Service and Mobile May Feature Summary

Source: https://powerbi.microsoft.com/en-us/blog/power-bi-service-and-mobile-may-2018-feature-summary/

We’re almost halfway through the year, and what a great start it’s been to 2018! The Power BI team has worked in full force over the last few months to deliver the features READ MORE


06 Jun

Azure Data Lake Tools for VSCode supports Azure blob storage integration

We are pleased to announce the integration of the VSCode explorer with Azure blob storage. Whether you are a data scientist who wants to explore the data in your Azure blob storage, or a developer who wants to access and manage your Azure blob storage files, please try the Data Lake Explorer blob storage integration. The Data Lake Explorer allows you to easily navigate to your blob storage and to access and manage your blob containers, folders, and files.

Summary of new features

Blob container – Refresh, Delete Blob Container and Upload Blob 

Folder in blob – Refresh and Upload Blob 

File in blob – Preview/Edit, Download, Delete, Create EXTRACT Script (only available for CSV, TSV and TXT files), as well as Copy Relative Path, and Copy Full Path
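As an aside (this is not part of the extension itself), the same kinds of operations the explorer surfaces, such as listing a container, downloading a blob, and uploading one, can also be scripted with the azure-storage-blob Python package; the connection string, container, and file names below are placeholders:

from azure.storage.blob import BlobServiceClient

# Placeholder values: use your own storage account connection string.
CONNECTION_STR = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=...;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(CONNECTION_STR)
container = service.get_container_client("sample-container")

# List the blobs in the container (what the explorer shows when expanded).
for blob in container.list_blobs():
    print(blob.name)

# Download one blob to a local file, then upload a new one.
with open("data.csv", "wb") as f:
    f.write(container.download_blob("data.csv").readall())

with open("report.txt", "rb") as f:
    container.upload_blob("report.txt", f, overwrite=True)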

How to install or update

Install Visual Studio Code and download Mono 4.2.x (for Linux and Mac). Then get the latest Azure Data Lake Tools by going to the VSCode Extension repository or the VSCode Marketplace and searching for Azure Data Lake Tools.

For more information about Azure Data Lake Tools for VSCode, please use


06 Jun

Detecting script-based attacks on Linux

Last month, we announced the extension of Azure Security Center’s detection for Linux. This post aims to demonstrate how existing Windows detections often have Linux analogs. A specific example of this is the encoding or obfuscation of command lines. The reasons an attacker might wish to encode their commands include minimizing quoting and escaping issues when encapsulating commands in scripts and providing a basic means of hiding from host-based intrusion detection. These techniques have the additional benefit of avoiding the need to drop a file to disk, reducing the risk to an attacker of being detected by traditional anti-virus products.

Encoded PowerShell attacks on Windows

There are many examples of such behavior being used in attacks against Windows environments. A previous blog post highlights one such technique to encode PowerShell commands as base64. PowerShell actually makes this amazingly easy, allowing commands of the form:

powershell.exe -EncodedCommand dwByAGkAdABlAC0AbwB1AHQAcAB1AHQAIABFAG4AYwBvAGQAZQBkACAAUABvAHcAZQByAFMAaABlAGwAbAAgAHMAYwByAGkAcAB0AA==

The only real stumbling block is the requirement that the decoded command must be UTF-16 (hence the prevalence of ‘A’ in the resulting base64).
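To make the encoding step concrete, here is a minimal sketch (not from the post) that reproduces the payload above in Python; the only trick is encoding the plaintext as UTF-16LE before the base64 step:

import base64

# PowerShell's -EncodedCommand expects base64 over a UTF-16LE string;
# the interleaved zero bytes are why 'A' appears so often in the output.
command = "write-output Encoded PowerShell script"
encoded = base64.b64encode(command.encode("utf-16-le")).decode("ascii")

print(f"powershell.exe -EncodedCommand {encoded}")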

Encoded shell attacks on Linux

As with attacks on Windows systems, the same motivations exist for encoding commands on Linux systems. Namely, to avoid encapsulation issues with special characters and
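As a hedged illustration of where this is heading (the exact commands below are common examples, not necessarily the ones the post goes on to discuss), the Linux analogue usually wraps a base64 blob in a decode-and-pipe-to-shell one-liner; the sketch below builds one in Python:

import base64

# Encode a shell command as base64, then wrap it in the classic
# "decode and pipe to a shell" pattern often seen on Linux hosts.
command = 'echo "Encoded bash script"'
blob = base64.b64encode(command.encode()).decode("ascii")

one_liner = f"echo {blob} | base64 --decode | bash"
print(one_liner)  # the string an attacker would actually run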


06 Jun

Azure Search is now certified for several levels of compliance

Compliance is an important factor for customers evaluating software and services as they work to meet their own compliance obligations across regulated industries and markets worldwide. For example, ISO 27001 is a security standard that provides a baseline set of requirements for many other international standards and regulations, and HIPAA (the Health Insurance Portability and Accountability Act) is a US law that establishes requirements for the use, disclosure, and safeguarding of protected health information (PHI).

For that reason, we are excited to announce that Azure Search has been certified for several levels of compliance including:

ISO 27001:2013
SOC 2 Type 2
GxP (21 CFR Part 11)
HIPAA and the HITECH Act
HITRUST
PCI DSS Level 1
Australia IRAP Unclassified

With these certifications and attestations, we hope to make Azure Search a viable option for customers looking to meet key international and industry-specific compliance standards within their solutions.

Azure compliance offerings are grouped into four segments: globally applicable, US government, industry specific, and region/country specific. For an overview of Azure Search as well as other Microsoft Azure compliance offerings, please visit the Microsoft Trust Center. In addition, you can directly download a document that provides


06 Jun

Mellanox uses Azure to accelerate network design

What do you do when your networking powers 60 percent of the HPC systems on the Top500 supercomputer list and you have 70 percent market share for 25G and faster network adapters? You continue to push the boundaries of performance to keep your position. But developing high-speed, low-latency networking gear is not an easy process, and you need an efficient IT infrastructure to support your mission.

Journey to the cloud

Mellanox’s journey to running HPC workloads in Azure started a few years ago. At first, they were looking for services like disaster recovery, where they could keep the environment deprovisioned most of the time. Then they started looking at moving services like backups to the cloud. Blob Storage was more attractive to them than managing tape libraries. As they gained comfort with that, they began moving additional services like email and SharePoint.

Mellanox’s users became comfortable using routine services in Azure; the performance and stability were attractive, and it allowed IT teams to focus on the areas that add value. When it came time to look at bursting the design environment, Mellanox looked into public cloud options.

Mellanox worked with Univa, a leading provider of HPC scheduling and


05 Jun

Power BI Desktop June Feature Summary

Source: https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-june-2018-feature-summary/

This month our reporting features focus on accessibility and more flexibility when formatting your charts. With filtering and sorting options now available, our data view is more powerful. Several connectors also get READ MORE
