Author: All posts by ilikesql

10 Apr

Achieving GDPR compliance in the cloud with Microsoft Azure

The General Data Protection Regulation (GDPR) officially goes into effect on May 25, 2018. Will your organization be ready?

Very soon, the GDPR will replace the Data Protection Directive as the new global standard on data privacy for all government agencies and organizations that do business with European Union (EU) citizens. When it does, all organizations that control, maintain, or process information involving EU citizens will be required to comply with strict new rules regarding the protection of personal customer data. For companies that store and manage data in the cloud, assuming existing infrastructure will remain compliant with new regulatory requirements might result in significant fines.

It’s important to understand that the differences between the new GDPR and the Data Protection Directive could impact your cloud data and security controls. For example, GDPR’s broad interpretation of what constitutes personal information leaves relevant agencies and organizations responsible for providing “reasonable” protection for a wider range of data types, including genetic and biometric data. More than ever, this regulatory transition highlights the importance of implementing a comprehensive cloud security strategy for your company.

According to a recent GDPR benchmarking survey, although 89 percent of organizations have (or plan to have) a formal


10 Apr

Three critical analytics use cases with Microsoft Azure Databricks

Data science and machine learning can be applied to solve many common business scenarios, yet there are many barriers preventing organizations from adopting them. Two examples of such barriers are enabling collaboration between data scientists, data engineers, and business analysts, and curating structured and unstructured data from disparate sources; and we haven’t even gotten to the complexity involved when trying to do these things with large volumes of data.

Recommendation engines, clickstream analytics, and intrusion detection are common scenarios that many organizations are solving across multiple industries. They require machine learning and streaming analytics, and they involve massive amounts of data processing that can be difficult to scale without the right tools. Companies like Lennox International, E.ON, and renewables.AI are just a few examples of organizations that have deployed Apache Spark™ to solve these challenges using Microsoft Azure Databricks.

Your company can enable data science with high-performance analytics too. Designed in collaboration with the original creators of Apache Spark, Azure Databricks is a fast, easy, and collaborative Apache Spark™-based analytics platform optimized for Azure. Azure Databricks is integrated with Azure through one-click setup and provides streamlined workflows and an interactive workspace that enables collaboration between data scientists, data engineers, and business
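To make a scenario like the recommendation engine concrete, here is a minimal PySpark sketch of the kind of collaborative-filtering job a team might run in an Azure Databricks notebook. It is only an illustration: the `spark` session is assumed to be pre-created (as it is in Databricks notebooks), and the ratings file path and column names (userId, itemId, rating) are hypothetical, not taken from the post.

```python
# Minimal collaborative-filtering sketch for a Databricks notebook.
# The `spark` session is pre-created in Databricks; data path and columns are placeholders.
from pyspark.ml.recommendation import ALS
from pyspark.ml.evaluation import RegressionEvaluator

ratings = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("/mnt/data/ratings.csv"))

train, test = ratings.randomSplit([0.8, 0.2], seed=42)

# Alternating Least Squares is Spark ML's standard recommendation algorithm.
als = ALS(userCol="userId", itemCol="itemId", ratingCol="rating",
          coldStartStrategy="drop", nonnegative=True)
model = als.fit(train)

# Score the held-out split and report RMSE.
predictions = model.transform(test)
rmse = RegressionEvaluator(metricName="rmse", labelCol="rating",
                           predictionCol="prediction").evaluate(predictions)
print(f"RMSE: {rmse:.3f}")

# Top-10 item recommendations for every user.
model.recommendForAllUsers(10).show(truncate=False)
```

The same workspace can host the data curation notebooks and the business analysts' reporting alongside model code like this, which is the collaboration point the post is making.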


10 Apr

Announcing larger, higher scale storage accounts

Data storage is one of the fastest-growing areas in cloud computing. With a variety of workloads such as IoT telemetry, logging, media, genomics, and archival driving cloud data growth, the need for scalable capacity, bandwidth, and transactions for storing and analyzing data for business insights is more important than ever.

Up to 10x increase to Blob storage account scalability

We are excited to announce improvements in the capacity and scalability of standard Azure storage accounts, which greatly improve your experience building cloud-scale applications using Azure Storage. Effective immediately, via a request made to Azure Support, Azure Blob storage accounts or General Purpose v2 storage accounts can support the following larger limits. The defaults remain the same as before.

| Resource | Default | New Limit |
| --- | --- | --- |
| Max capacity for Blob storage accounts | 500 TB | 5 PB (10x increase) |
| Max TPS/IOPS for Blob storage accounts | 20K | 50K (2.5x increase) |
| Max ingress for Blob storage accounts | 5-20 Gbps (varies by region/redundancy type) | 50 Gbps (up to 10x increase) |
| Max egress for Blob storage accounts | 10-30 Gbps (varies by region/redundancy type) | … |
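As a rough illustration of what the higher ingress and transaction limits enable, the sketch below uploads a large blob with parallel blocks using the azure-storage-blob Python SDK (a newer SDK than existed when this was announced). The connection string environment variable, container, blob, and file names are placeholders.

```python
# Hypothetical example: parallel block upload to a Blob storage account.
# Connection string, container, blob, and file names are placeholders.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])
blob = service.get_blob_client(container="telemetry", blob="logs/day-001.jsonl")

with open("local-logs.jsonl", "rb") as data:
    # max_concurrency uploads blocks of a large blob in parallel, which is how a
    # single client can take advantage of the higher account ingress limits.
    blob.upload_blob(data, overwrite=True, max_concurrency=8)
```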


09 Apr

Celebrating data and you with #SQLDataWins

Hey you, data whiz! You’re an expert, a data aficionado, a steward of a finely tuned database – but not everyone gets it. Well, we do.

#SQLDataWins celebrates professionals who make an impact with data every day. Play along with us on Twitter by responding to our weekly tweets. It’s that simple.

Every Monday at 9:00am PT @SQLServer will tweet out a fun data-related scenario. We want to see your reaction in the form of a GIF, because who doesn’t love a good GIF? Reply directly to the weekly tweet by Thursday at 11:59pm PT with your favorite GIF and include the hashtags #Sweepstakes #SQLDataWins in your tweet, and you’ll be entered to win* an awesome SQL Server prize pack! A couple of pro tips: You must be a legal resident of the 50 United States, and be sure to come back to @SQLServer every week for a brand-new scenario and another chance to win.

Looking to brush up on all things SQL Server and keep on winning with data? Check out these helpful resources to keep your data game on point.

FAQs

How do I enter?

Visit twitter.com/sqlserver and find the current week’s #SQLDataWins scenario tweet that’s pinned to the top



09 Apr

Azure.Source – Volume 26

Last week, Azure expanded its global footprint with new regions available for Australia and New Zealand. In addition, updates were made to reduce your Azure costs, such as flexible new ways to purchase Azure SQL Database, a new inexpensive pricing tier in Azure IoT Hub for device-to-cloud telemetry, and a new pricing model for Azure monitoring services.

Now in preview

A flexible new way to purchase Azure SQL Database – Recently announced with SQL Database Managed Instance, the vCore-based purchasing model reflects our commitment to customer choice by providing flexibility, control, and transparency. As with Managed Instance, the vCore-based model makes the Elastic Pool and Single Database options eligible for up to 30 percent savings with the Azure Hybrid Benefit for SQL Server.

SQL Database: Long-term backup retention preview includes major updates – The preview of long-term backup retention in Azure SQL Database, announced in October 2016, has been updated with a set of major enhancements based on customer feedback requesting more regional support, more flexible backup policies, management of individual backups, and streamlined configuration.

Also in preview

Public preview: Read scale-out support for Azure SQL Database

Now generally available

Application Security Groups now generally available in all


09 Apr

From Microsoft Azure to everyone attending NAB Show 2018 — Welcome!

This blog post was authored by Tad Brockway, General Manager, Azure Storage and Azure Stack.

NAB Show is one of my favorites. The creativity, technology, content and storytelling are epic, as is the digital transformation well underway.

This transformation — driven by new business models, shifting consumption patterns and technology advancements — will change this great industry. What won’t change is the focus on creators and connecting their content with consumers around the world so they, too, can be a part of the story.

Microsoft’s mission is to “empower every person and every organization on the planet to achieve more.” We are committed to helping everyone in the industry — customers like Endemol Shine, UFA, Jellyfish and Reuters — do just that. With innovations across cloud storage, compute, CDN, Media Services and new investments in Avere Systems and Cycle Cloud, Microsoft Azure is ready to help modernize your media workflows and your business.

How? Well, cue scene…

Your productions are increasingly global and demanding. Azure can help.

Creators, media artists and innovators want to work together in a flexible, secure and collaborative way from wherever they are. More datacenters, in more regions of the world, than all competitors combined means


09 Apr

A year’s worth of cloud, AI and partner innovation. Welcome to NAB 2018!

As I reflect on cloud computing and the media industry since last year’s NAB, I see two emerging trends. First, content creators and broadcasters such as Rakuten, RTL, and Al Jazeera are increasingly using the global reach, hybrid model, and elastic scale of Azure to create, manage, and distribute their content. Second, AI-powered tools for extracting insights from content are becoming an integral part of the content creation, management and distribution workflows with customers such as Endemol Shine Group and Zone TV.

Therefore, at this year’s NAB, we are focused on helping you modernize your media workflows, so you can get the best of cloud computing and AI.  We made a number of investments to enable better content production workflows in Azure, including the recent acquisition of Avere Systems. You can learn more about how Azure can help you improve your media workflows and business here.

Read on to learn more about the key advancements we’ve made – in media services, distribution and our partner ecosystem – since last year’s IBC.

Azure Media Services

Democratizing AI for the Media Industry: Since its launch at NAB 2016, Azure Media Analytics has come a long way. At Build 2017, we launched Video Indexer,


09 Apr

How to configure Azure SQL Database Geo-DR with Azure Key Vault

Azure SQL Database and Data Warehouse offer encryption-at-rest by providing Transparent Data Encryption (TDE) for all data written to disk, including databases, log files and backups. This protects data in case of unauthorized access to hardware. TDE provides a TDE Protector that is used to encrypt the Database Encryption Key (DEK), which in turn is used to encrypt the data. With the TDE and Bring Your Own Key (BYOK) offering currently in preview, customers can take control of the TDE Protector in Azure Key Vault.

Taking advantage of TDE with BYOK for databases that are geo-replicated to maintain high availability requires configuring and testing the scenario carefully. This post will go over the most common configuration options.

To avoid creating a single point of failure in active geo-replicated instances or SQL failover groups, you must configure redundant Azure Key Vaults. Each geo-replicated server requires a separate key vault that must be co-located with the server in the same Azure region. Should a primary database become inaccessible due to an outage in one region and a failover is triggered, the secondary database can take over using the secondary key vault.
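As one hedged illustration of that redundant-vault setup, the sketch below uses the azure-identity and azure-keyvault-keys Python packages (current SDKs, newer than this post) to back up a TDE protector key from the primary region’s vault and restore the same key material into the secondary region’s vault. The vault URLs and key name are placeholders, Key Vault restores are only supported within the same Azure geography and subscription, and assigning each key as its server’s TDE protector is a separate step (for example in the portal or PowerShell).

```python
# Hypothetical sketch: make the same TDE protector key material available in a
# second regional key vault. Vault URLs and the key name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

credential = DefaultAzureCredential()
primary_vault = KeyClient(
    vault_url="https://primary-region-kv.vault.azure.net", credential=credential)
secondary_vault = KeyClient(
    vault_url="https://secondary-region-kv.vault.azure.net", credential=credential)

# Back up the TDE protector key from the vault co-located with the primary server...
backup = primary_vault.backup_key("tde-protector")

# ...and restore it into the vault co-located with the geo-secondary server.
restored = secondary_vault.restore_key_backup(backup)
print(restored.id)

# Not shown: add each vault's key to its co-located SQL server and set it as that
# server's TDE protector (portal, PowerShell, or the management API).
```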

For Geo-Replicated Azure SQL databases, the


09 Apr

Seamlessly upgrade Azure SQL Data Warehouse for greater performance and scalability

Azure SQL Data Warehouse recently announced the preview release of the Optimized for Compute performance tier, providing customers with an enhanced offering of the service. With this major release, the service now offers a 5X increase in compute scalability and unlimited storage for columnar data. Along with the increased capacity, customers are realizing an average 5X increase in performance for query workloads. For existing Optimized for Elasticity customers wanting to capitalize on these benefits, there is now an option to seamlessly upgrade via the Azure Portal. The easy-to-use upgrade experience in the Azure Portal has none of the downtime associated with exporting and re-importing data.

Upgrade to optimize for performance

You can now upgrade to the latest performance tier within the Azure Portal, and the upgrade results in no change to your connection string details.
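Because the connection string does not change, a simple sanity check is to query the database’s service objective before and after the upgrade. The sketch below is a hypothetical pyodbc example with placeholder server, database, and credentials; it assumes the sys.database_service_objectives view queried on the logical server’s master database, where Optimized for Compute databases report a service objective with a “c” suffix (for example DW1000c).

```python
# Hypothetical check of an Azure SQL Data Warehouse service objective via pyodbc.
# Server name, credentials, and database name are placeholders; requires an
# ODBC Driver for SQL Server to be installed locally.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # same server name as before the upgrade
    "DATABASE=master;"
    "UID=myadmin;PWD=<password>"
)

row = conn.execute(
    """
    SELECT db.name, ds.edition, ds.service_objective
    FROM sys.database_service_objectives ds
    JOIN sys.databases db ON ds.database_id = db.database_id
    WHERE db.name = ?
    """,
    "mydatawarehouse",
).fetchone()

# After upgrading to Optimized for Compute, expect a 'c'-suffixed objective (e.g. DW1000c).
print(row.name, row.edition, row.service_objective)
```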

To learn more about the upgrade process, visit our upgrade documentation. If you need help with a POC, contact us directly. Stay up to date on the latest Azure SQL DW news and features by following us on Twitter @AzureSQLDW.
