Author: All posts by ilikesql

23 May

5/24 Webinar: Time intelligence for retail and wholesale industries with Power BI by Matt Allington

Source: https://powerbi.microsoft.com/en-us/blog/5-24-webinar-time-intelligence-for-retail-and-wholesale-industries-with-power-bi-by-matt-allington/

A Power BI webinar focusing on how time intelligence can be used for analysis in the retail and wholesale industries.

23 May

Azure AD Authentication for Azure Storage now in public preview

We are excited to announce the preview of Azure AD Authentication for Azure Blobs and Queues. This capability is one of the features most requested by enterprise customers looking to simplify how they control access to their data as part of their security or compliance needs. This capability is available in all public regions of Azure.

Azure Storage supports several mechanisms that give you flexibility to control who can access your data, as well as how, when, and from where they can access it. With AAD authentication, customers can now use Azure’s role-based access control framework to grant specific permissions to users, groups, and applications, down to the scope of an individual blob container or queue. This capability extends the existing Shared Key and SAS token authorization mechanisms, which continue to be available.

Developers can also leverage Managed Service Identity (MSI) to give Azure resources (Virtual Machines, Function Apps, Virtual Machine Scale Sets, etc.) an automatically managed identity in Azure AD. Administrators can assign roles to these identities and run applications securely, without keeping any credentials in code.
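As a minimal sketch of what this looks like from a client (not taken from the announcement; the account name is a placeholder, and the packages shown are the current azure-identity and azure-storage-blob Python libraries):

# Hedged sketch: authenticate to Blob storage with an Azure AD identity.
# DefaultAzureCredential picks up a managed identity (MSI) when running
# on an Azure resource, or a developer sign-in when running locally.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Succeeds only if the identity holds an RBAC role such as
# "Storage Blob Data Reader" at the account or container scope.
for container in service.list_containers():
    print(container.name)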

Administrators can grant permissions and use AAD Authentication with any Azure Resource Manager storage account using the Azure …

23 May

Create enterprise subscription experience in Azure portal public preview

Historically, Azure Enterprise Agreement (EA) subscriptions were created in the EA portal, while management of services was done in the Azure portal. Our goal is to converge on the Azure portal as the primary avenue for users to manage their Azure services and subscriptions.

We are making available the public preview of the create subscription experience in the Azure portal. This capability directly aligns with the ability to create multiple enterprise subscriptions using the Create Subscription API. The experience is fully integrated into the Azure portal and will enable you to get an EA subscription created quickly, without any programming.
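For readers who do prefer the programmatic route, here is a hedged sketch of calling that API from Python; the endpoint, api-version, and offer code are assumptions drawn from the preview REST API of this era and should be checked against current documentation:

# Hedged sketch of the Create Subscription API mentioned above.
# The endpoint, api-version, and offer code are assumptions; verify
# them against current documentation before use.
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

enrollment_account = "<enrollment-account-name>"  # placeholder
url = (
    "https://management.azure.com/providers/Microsoft.Billing/"
    f"enrollmentAccounts/{enrollment_account}/providers/"
    "Microsoft.Subscription/createSubscription"
    "?api-version=2018-03-01-preview"
)
body = {
    "displayName": "Dev Team Subscription",
    "offerType": "MS-AZR-0017P",  # assumed EA offer code; EA Dev/Test uses a different one
}
response = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {token.token}"})
response.raise_for_status()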

Getting started

The following steps only apply to EA and EA Dev/Test subscriptions. The majority of users will be able to access the user experience below. There will be some users who do not meet the prerequisites to create a subscription in the Azure portal. For those users, the “+Add” button will open a separate window to create new subscriptions.

The steps for using the create enterprise subscription experience in the Azure portal are as follows:

1. If you are not an account owner, get added by an EA enrollment admin.
2. Navigate to the Subscriptions extension in the Azure …

23 May

Serverless real-time notifications in Azure using Azure #CosmosDB

There were lots of announcements at the Microsoft Build 2018 conference, but one that caught my eye was the preview release of Azure SignalR, a Platform-as-a-Service (PaaS) offering that lets you implement real-time messages and notifications quite easily, without worrying about instances or hosts.

So it made me wonder, could I build something using my favorite globally-distributed and serverless database, Azure Cosmos DB, and Azure’s serverless compute offering, Azure Functions? It turns out others were interested in this topic too.

Real-time, really?

For those of you who do not know, SignalR is a library that has been around since 2013 for the ASP.NET Framework, recently rewritten for ASP.NET Core under the name SignalR Core. It lets you easily create real-time applications and push content to clients over the WebSocket protocol, gracefully falling back to other transports depending on the client. It works great for games, dashboards/monitoring apps, collaborative apps, mapping/tracking apps, or any app requiring notifications.

By leveraging the WebSocket protocol, SignalR can push content to clients over a single two-way TCP channel that is maintained for the entire session, without the overhead of opening multiple HTTP connections.

Going serverless!

One requirement of SignalR was that you …
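The excerpt cuts off here, but the pattern the post builds, the Cosmos DB change feed feeding an Azure Function that pushes through SignalR, can be sketched as follows. This is a hedged illustration, not code from the post: the binding names would need to be declared in an accompanying function.json with a Cosmos DB trigger and a SignalR Service output binding.

# Hedged sketch: an Azure Function fired by the Cosmos DB change feed
# that fans each batch of changed documents out to connected clients
# through the SignalR Service output binding. The binding names
# ("documents", "signalRMessages") are assumptions declared in an
# accompanying function.json.
import json
import azure.functions as func

def main(documents: func.DocumentList, signalRMessages: func.Out[str]) -> None:
    if not documents:
        return
    signalRMessages.set(json.dumps({
        "target": "documentsChanged",  # client-side handler name
        "arguments": [json.loads(doc.to_json()) for doc in documents],
    }))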

23 May

An update on the integration of Avere Systems into the Azure family

It has been three months since we closed on the acquisition of Avere Systems. Since that time, we’ve been hard at work integrating the Avere and Microsoft families, growing our presence in Pittsburgh and meeting with customers and partners at The National Association of Broadcasters Show.

It’s been exciting to hear how Avere has helped businesses address a broad range of compute and data challenges, helping produce blockbuster movies and life-saving drug therapies faster than ever before with hybrid and public cloud options. I’ve also appreciated having the opportunity to address our customers’ questions and concerns, and thought it might be helpful to share the most common ones with the broader Azure/Avere community:

When will Avere be available on Microsoft Azure? We are on track to release Microsoft Avere vFXT to the Azure Marketplace later this year. With this technology, Azure customers will be able to run compute-intensive applications completely on Azure or take advantage of our scale on an as-needed basis.

Will Microsoft continue to support the Avere FXT physical appliance? Yes, we will continue to invest in, upgrade, and support the Microsoft Avere FXT physical appliance, which customers tell us is particularly important for their on-premises and hybrid environments.

23 May

Load confidently with SQL Data Warehouse PolyBase Rejected Row Location

Every row of your data is an insight waiting to be found, which is why it is critical that every row can be loaded into your data warehouse. When the data is clean, loading it into Azure SQL Data Warehouse with PolyBase is easy: it is elastic, globally available, and leverages Massively Parallel Processing (MPP). In reality, clean data is a luxury that is not always available. In those cases you need to know which rows failed to load and why.

In Azure SQL Data Warehouse, the CREATE EXTERNAL TABLE definition has been extended to include a REJECTED_ROW_LOCATION parameter. This value represents the location on the External Data Source where the error file(s) and rejected row(s) will be written.

CREATE EXTERNAL TABLE [dbo].[Reject_Example]
(
    [Col_one]   TINYINT      NULL,
    [Col_two]   VARCHAR(100) NULL,
    [Col_three] NUMERIC(2,2) NULL
)
WITH
(
    DATA_SOURCE = EDS_Reject_Row,
    LOCATION = 'Read_Directory',
    FILE_FORMAT = CSV,
    REJECT_TYPE = VALUE,
    REJECT_VALUE = 100,
    REJECTED_ROW_LOCATION = 'Reject_Directory'
);

What happens when data is loaded?

When a user runs a CREATE TABLE AS SELECT (CTAS) against the table above, PolyBase creates a directory on the External Data Source at the REJECTED_ROW_LOCATION if one doesn’t exist. A child directory is created with the name “_rejectedrows”. The “_” …
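As a hedged illustration of the load step (the connection string and target table name are placeholders, not from the post), the CTAS that triggers this rejection handling could be issued from Python like so:

# Hedged sketch: issue the CTAS load that triggers PolyBase rejection
# handling. Server, database, and credentials are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<server>.database.windows.net;Database=<dw>;"
    "Uid=<user>;Pwd=<password>",
    autocommit=True,
)
conn.execute("""
    CREATE TABLE dbo.Loaded_Example
    WITH (DISTRIBUTION = ROUND_ROBIN)
    AS SELECT * FROM dbo.Reject_Example;
""")
# Rows beyond REJECT_VALUE leave error files under
# Reject_Directory/_rejectedrows on the external data source.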

23 May

Do more with Chef and Microsoft Azure

We’re committed to making Azure work great with the open source tools you know and love, and if you’re using Chef products or open source projects, there’s never been a better time to try Azure. We’ve had a rich history of partnership and collaboration with Chef to deliver automation tools that help you with cloud adoption. Today, at ChefConf, the Chef and Azure teams are excited to announce the inclusion of Chef InSpec directly in Azure Cloud Shell, as well as the new Chef Developer Hub in Azure Docs.

InSpec in Azure Cloud Shell

In addition to other open source tools like Ansible and Terraform that are already available, today we are announcing the availability of Chef InSpec, pre-installed and ready to use for every Azure user in the Azure Cloud Shell. This makes bringing your InSpec tests to Azure simple; in fact, it’s the easiest way to try out InSpec, with no installation or configuration required.

Figure 1: InSpec Exec within Azure Cloud Shell

Chef Developer Hub for Azure

We are launching the new Chef Developer Hub so Azure customers can more easily implement their solutions using Chef open source software. Whether you’re using Chef, InSpec, or …

23 May

On-premises data gateway May update is now available

Source: https://powerbi.microsoft.com/en-us/blog/on-premises-data-gateway-may-update-is-now-available/

We are excited to announce that we have released the May update for the On-premises data gateway. Here are some of the things that we would like to highlight with this month’s release. READ MORE

23 May

Control Azure Data Lake costs using Log Analytics to create service alerts

Azure Data Lake customers use the Data Lake Store and Data Lake Analytics to store and run complex analytics on massive amounts of data. However, it is challenging to manage costs, keep up to date with activity in the accounts, and proactively know when usage thresholds are nearing certain limits. Using Log Analytics with Azure Data Lake, we can address these challenges and know when costs are increasing or when certain activities take place.

In this post, you will learn how to use Log Analytics with your Data Lake accounts to create alerts that can notify you of Data Lake activity events and when certain usage thresholds are reached. It is easy to get started!

Step 1: Connect Azure Data Lake and Log Analytics

Data Lake accounts can be configured to generate diagnostics logs, some of which are generated automatically (e.g., regular Data Lake operations such as reporting current usage, or whenever a job completes), while others are generated on request (e.g., when a new file is created or opened, or when a job is submitted). Both Data Lake Analytics and Data Lake Store can be configured to send these diagnostics logs to a Log Analytics account where we can query …
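The excerpt ends before the query step, but as a hedged sketch (the workspace ID is a placeholder, and the AzureDiagnostics table and resource-provider filter are assumptions to verify, not details from the post), those logs could be queried from Python with the azure-monitor-query package:

# Hedged sketch: query Data Lake diagnostics from a Log Analytics
# workspace. The workspace ID is a placeholder; the table name and
# filter are assumptions to check against your own diagnostics setup.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<workspace-id>",  # placeholder
    query='AzureDiagnostics | where ResourceProvider == "MICROSOFT.DATALAKESTORE" | take 10',
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)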

22 May

Accelerate data warehouse modernization with Informatica Intelligent Cloud Services for Azure

Today at Informatica World, Scott Guthrie, EVP, Cloud + AI, along with Anil Chakravarthy, CEO of Informatica, announced the availability of Informatica Intelligent Cloud Services (IICS) for Azure. Microsoft has partnered with Informatica, a leader in Enterprise Data Management, to help our customers accelerate data warehouse modernization. This service is available as a free preview on Azure today.

Informatica provides a discovery-driven approach to data warehouse migration. This approach simplifies the process of identifying and moving data into Azure SQL Data Warehouse (SQL DW), Microsoft’s petabyte-scale, fully managed, globally available analytics platform. With the recently released SQL DW Compute Optimized Gen2 tier, you can enjoy 5x performance, 4x concurrency, and 5x scale compared with the previous generation.

With this release, Informatica Intelligent Cloud Services for Azure can be launched directly from the Azure Portal. You can enjoy a single sign-on experience and don’t have to create a separate Informatica account. With Informatica Data Accelerator for Azure, you can discover and load data into SQL DW. Informatica’s discovery-driven approach allows you to work with thousands of tables and columns.

“We are very excited about this next step in our long-standing partnership with Microsoft,” said Pratik Parekh, VP, Product Management, Informatica.