Category Archives: Updates

22 Apr

Propel your IoT platform to the cloud with Azure Time Series Insights!

Today we’re pleased to announce two key capabilities that Azure Time Series Insights will be delivering later this year:

Cost-effective long-term storage that enables a cloud-based solution to trend years’ worth of time series data pivoted on devices/tags.

A device-based (also known industry-wide as “tag-based”) user experience backed by a time series model that contextualizes raw time series data with device metadata and domain hierarchies.

Additionally, Time Series Insights will be integrating with advanced machine learning and analytics tools like Spark and Jupyter notebooks to help customers tackle time series data challenges in new ways. Data scientists and process engineers in industries like oil & gas, power & utility, manufacturing, and building management rely on time series data solutions for critical tasks like storage, data analysis, and KPI tracking, and they’ll be able to do all of this using Time Series Insights.

Time series model and tag-centric experience

Time Series Insights’ current user interface is great for data scientists and analysts. However, process engineers and asset operators may not always find this experience natural to use. To address this, we are adding a device-based user experience to the Time Series Insights explorer. This new interface and the underlying time series


17 Apr

Transparent Data Encryption with customer managed keys in Azure SQL Database generally available

Today, we are excited to announce the general availability of Transparent Data Encryption (TDE) with Bring Your Own Key (BYOK) support for Azure SQL Database and Azure SQL Data Warehouse. This is one of the most frequently requested features by enterprise customers looking to protect sensitive data and meet regulatory or compliance obligations that require implementation of specific key management controls. TDE with BYOK support is offered in addition to TDE with service managed keys, which is enabled by default on all new Azure SQL Databases.

TDE with BYOK support uses Azure Key Vault, which provides highly available and scalable secure storage for RSA cryptographic keys backed by FIPS 140-2 Level 2 validated Hardware Security Modules (HSMs). Key Vault streamlines the key management process and enables customers to maintain full control of encryption keys and to manage and audit key access.

Customers can generate and import their RSA key to Azure Key Vault and use it with Azure SQL Database and Azure SQL Data Warehouse TDE with BYOK support. Azure SQL Database handles the encryption and decryption of data stored in databases, log files, and backups in a fully transparent fashion by using a symmetric Database Encryption Key
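As a purely illustrative sketch (not Azure’s implementation and not the Key Vault API), the Python snippet below shows the envelope-encryption pattern described above: a symmetric Database Encryption Key encrypts the data, and a customer-managed asymmetric key, here a local RSA key standing in for the Key Vault-protected TDE protector, wraps that DEK. All names in the snippet are hypothetical.

```python
# Purely illustrative sketch of the envelope-encryption pattern behind TDE with BYOK.
# This is NOT Azure's implementation or the Key Vault API; the local RSA key below
# merely stands in for a customer-managed key held in Azure Key Vault.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Customer-managed key (in Azure this would live in Key Vault, backed by an HSM).
key_encryption_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Symmetric Database Encryption Key (DEK) used to encrypt the actual data.
dek = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(dek).encrypt(nonce, b"sensitive row data", None)

# Wrap the DEK with the customer-managed RSA key, so only the key's owner can
# unwrap it; the plaintext DEK is never stored alongside the data.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_dek = key_encryption_key.public_key().encrypt(dek, oaep)

# To read the data, unwrap the DEK with the private key, then decrypt.
unwrapped_dek = key_encryption_key.decrypt(wrapped_dek, oaep)
assert AESGCM(unwrapped_dek).decrypt(nonce, ciphertext, None) == b"sensitive row data"
```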


17 Apr

Azure Marketplace new offers in March 2018

We continue to expand the Azure Marketplace ecosystem. In March 2018, 55 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Kentico on Windows Server 2012 R2: Kentico CMS is a free-edition web content management system for building websites, online stores, intranets, and community sites. Create, manage, and integrate communities socially to encourage conversations about your brand.

OpenText Process Suite 16.3 Marketplace Info VM: With intelligently automated, content-rich processes that you can quickly build and easily modify, Process Suite gives you the power to deliver a variety of new digital experiences with a much lower IT workload.

Content Suite 16 (January 2018): OpenText Content Suite Platform is a comprehensive enterprise content management (ECM) system designed to manage the flow of information from capture through archiving and disposition.

BigDL Spark Deep Learning Framework VirtualMachine: Deep Learning framework for distributed computing designed for Apache Spark architecture and highly optimized for Intel Xeon CPUs. Feature-parity with TensorFlow, Caffe, etc., without the need for GPUs.

Gallery Server on Windows Server 2012 R2: Gallery Server is a free, open source, easy-to-use Digital


17 Apr

Recovery Services vault limit increased to 500 vaults per subscription per region

Today, we are delighted to announce increased scale limits for Azure Backup. Users can now create up to 500 Recovery Services vaults per subscription per region, up from the earlier limit of 25 vaults per region per subscription. Customers who had been hitting the 25-vault limit can now create additional vaults to manage their resources better. In addition, the number of Azure virtual machines that can be registered to each vault has increased from 200 to 1,000.

Key benefits

Better management of resources between departments in an organization: Flexibility to create a large number of vaults under a subscription, and a large number of containers under a vault, based on departmental requirements without worrying about hitting vault limits.

Better granularity in reporting and monitoring of data within vaults: Users can create separate vaults segregated according to organizational needs and get more granular reporting of backup usage on a per-vault basis.

Systematic and comprehensive billing: Users can get vault-level detailed billing for a subscription for better financial management within an organization.

Related links and


16 Apr

Iterative development and debugging using Data Factory

Data integration is becoming more and more complex as customer requirements and expectations continuously change. Users increasingly need to develop and debug their extract-transform-load (ETL) and extract-load-transform (ELT) workflows iteratively. Now, Azure Data Factory (ADF) visual tools allow you to do iterative development and debugging.

You can create your pipelines and do test runs using the Debug capability in the pipeline canvas without writing a single line of code. You can view the results of your test runs in the Output window of your pipeline canvas. Once your test run succeeds, you can add more activities to your pipeline and continue debugging in an iterative manner. You can also cancel test runs while they are in progress. You are not required to publish your changes to the data factory service before clicking Debug. This is helpful in scenarios where you want to make sure that new additions or changes work as expected before you update your data factory workflows in dev, test, or prod environments.

Data Factory visual tools also allow you to debug up to a particular activity on your pipeline canvas. Simply put a breakpoint on the activity until


12 Apr

Enhanced capabilities to monitor, manage, and integrate SQL Data Warehouse in the Azure Portal

Azure SQL Data Warehouse (SQL DW) continues to introduce updates to the Azure portal to provide a seamless user experience when monitoring, managing, and integrating your data warehouse.

Support for Azure Monitor metrics

SQL DW now supports Azure Monitor, a built-in monitoring service that consumes performance and health telemetry for your data warehouse. Azure Monitor not only enables you to monitor your data warehouse within the Azure portal, but its tight integration between Azure services also lets you monitor your entire data analytics solution within a single interface. For this release, the following data warehouse metrics have been enabled to help you identify performance bottlenecks and user activity:

Successful/Failed/Blocked by firewall connections
CPU
IO
DWU Limit
DWU Percentage
DWU used

These metrics now have a one-minute frequency for near real-time visibility into resource bottlenecks of your data warehouse. There is a default retention period of 90 days for all data warehouse metrics with Azure Monitor.

Configure metric charts in the Azure Monitor service through the Azure portal, or programmatically query for metrics via PowerShell or REST.
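As one hedged example of the programmatic route, the Python sketch below queries the Azure Monitor metrics REST endpoint for a data warehouse. The resource ID, bearer token, metric names, and API version are placeholder assumptions; check the Azure Monitor documentation for the exact identifiers for SQL DW.

```python
# Hedged sketch: query data warehouse metrics from the Azure Monitor REST API.
# The resource ID, bearer token, metric names, and API version are placeholder
# assumptions; check the Azure Monitor documentation for the exact identifiers.
import requests

RESOURCE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<data-warehouse>"
)
TOKEN = "<bearer-token>"  # e.g. obtained via `az account get-access-token`

response = requests.get(
    f"https://management.azure.com{RESOURCE_ID}/providers/microsoft.insights/metrics",
    params={
        "api-version": "2018-01-01",
        "metricnames": "cpu_percent,dwu_used",  # assumed metric identifiers
        "interval": "PT1M",                     # one-minute granularity
        "aggregation": "Average",
    },
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()

# Print each metric's one-minute average data points.
for metric in response.json().get("value", []):
    print(metric["name"]["value"])
    for series in metric.get("timeseries", []):
        for point in series.get("data", []):
            print(point.get("timeStamp"), point.get("average"))
```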

Pin configured charts for your data warehouse through Azure dashboards.

Safely manage costs by pausing

The pause feature for SQL


09 Apr

Seamlessly upgrade Azure SQL Data Warehouse for greater performance and scalability

Azure SQL Data Warehouse recently announced the preview release of the Optimized for Compute performance tier, an enhanced offering of the service. With this major release, the service now offers a 5X increase in compute scalability and unlimited storage for columnar data. Along with the increased capacity, customers are seeing an average 5X increase in performance for query workloads. Existing Optimized for Elasticity customers who want to capitalize on these benefits can now seamlessly upgrade via the Azure portal. The easy-to-use upgrade experience requires no downtime for exporting and re-importing data.

Upgrade to optimize for performance

You can now upgrade to the latest performance tier within the Azure portal. This results in no change to your connection string details.

To learn more about the upgrade process, visit our upgrade documentation. If you need help for a POC, contact us directly. Stay up-to-date on the latest Azure SQL DW news and features by following us on Twitter @AzureSQLDW.


09 Apr

Continuous integration and deployment using Data Factory

The Azure Data Factory (ADF) visual tools public preview was announced on January 16, 2018. With visual tools, you can iteratively build, debug, deploy, operationalize, and monitor your big data pipelines. Now you can follow industry-leading best practices to do continuous integration and deployment for your ETL/ELT (extract-transform-load, extract-load-transform) workflows across multiple environments (dev, test, prod, etc.). Essentially, you can test your codebase changes and push the tested changes to a test or prod environment automatically.

The ADF visual interface now allows you to export any data factory as an Azure Resource Manager (ARM) template. You can click ‘Export ARM template’ to export the template corresponding to a factory.

This will generate 2 files:

Template file: A template JSON file containing all the data factory metadata (pipelines, datasets, etc.) corresponding to your data factory.

Configuration file: Contains environment parameters that will differ for each environment (dev, test, prod, etc.), such as the storage connection, the Azure Databricks cluster connection, and so on.
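To make the template-plus-configuration split concrete, here is a minimal, hypothetical sketch (not part of ADF itself) that deploys the same exported template with a different parameter file per environment using the Azure CLI. File names, resource groups, and the helper are illustrative assumptions, and the Azure CLI is assumed to be installed and logged in.

```python
# Hypothetical sketch: deploy the exported ADF ARM template once per environment,
# reusing the same template file with an environment-specific configuration file.
# File names, resource groups, and this helper are illustrative assumptions.
import json
import subprocess

TEMPLATE_FILE = "arm_template.json"  # shared template exported from the ADF UI

ENVIRONMENTS = {
    "dev":  {"resource_group": "rg-adf-dev",  "parameters": "arm_template_parameters_dev.json"},
    "test": {"resource_group": "rg-adf-test", "parameters": "arm_template_parameters_test.json"},
    "prod": {"resource_group": "rg-adf-prod", "parameters": "arm_template_parameters_prod.json"},
}

def deploy(env: str) -> None:
    """Deploy the shared template with the chosen environment's configuration file."""
    cfg = ENVIRONMENTS[env]
    # Sanity-check that the parameter file is valid JSON before deploying.
    with open(cfg["parameters"]) as f:
        json.load(f)
    subprocess.run(
        [
            "az", "group", "deployment", "create",
            "--resource-group", cfg["resource_group"],
            "--template-file", TEMPLATE_FILE,
            "--parameters", "@" + cfg["parameters"],
        ],
        check=True,
    )

if __name__ == "__main__":
    deploy("test")  # push the tested changes to the test environment
```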

You will create a separate data factory per environment. You will then use the same template file for each environment and have one configuration file per environment. Clicking the ‘Import ARM Template’ button will take you to


04 Apr

Improvements to SQL Elastic Pool configuration experience

We have made some great improvements to the SQL elastic pool configuration experience in the Azure portal. These changes are released alongside the new vCore-based purchasing model for elastic pools and single databases. Our goal is to simplify your experience configuring elastic pools and ensure you are confident in your configuration choices.

Changing service tiers for existing pools

Existing elastic pools can now be scaled up and down between service tiers. You can easily move between service tiers and discover the one that best fits your business needs. You can also switch between the DTU-based and the new vCore-based service tiers, and scale your pool down outside of business hours to save cost.

Simplifying configuration of the pool and its databases

Elastic pools offer many settings for customers to customize. The new experience separates and simplifies each aspect of pool management: pool settings, database settings, and database management. This lets you more easily reason about each aspect of the pool while saving all settings changes in one batch.

Understanding your bill with new cost summary

Our new cost summary experience for elastic pools and single databases


03 Apr

Introducing a new way to purchase Azure monitoring services

Today customers rely on Azure’s application, infrastructure, and network monitoring capabilities to ensure their critical workloads are always up and running. It’s exciting to see the growth of these services and that customers are using multiple monitoring services to get visibility into issues and resolve them faster. To make it even easier to adopt Azure monitoring services, we are announcing a new, consistent purchasing experience across the monitoring services. Three key attributes of this new pricing model are:

1. Consistent pay-as-you-go pricing

We are adopting a simple “pay-as-you-go” model across the complete portfolio of monitoring services. You have full control and transparency, so you pay for only what you use. 

2. Consistent per gigabyte (GB) metering for data ingestion

We are changing the pricing model for data ingestion from “per node” to “per GB”. Customers told us that the value in monitoring came from the amount of data received and the insight built on top of that, rather than the number of nodes. In addition, this new model works best for the future of containers and microservices where the definition of a node is less clear. “Per GB” data ingestion is the new basis for pricing across application, infrastructure,
