Today we’re letting our customers know about our upcoming Data Subject Request (DSR) processing capability in the Azure portal, which will provide tenant admins with a simple, powerful tool to quickly fulfill the Data Subject Requests that are central to compliance with the European Union General Data Protection Regulation (GDPR). We will fully support these DSR capabilities before May 25, 2018, the date when enforcement of the GDPR begins and when Microsoft has committed to being GDPR compliant across our cloud services.
The GDPR is the most significant change to EU privacy law in two decades and sets a new global standard for privacy rights, governing the handling and use of personal data. A fundamental tenet of the GDPR is the set of rights it grants individuals, or data subjects, in connection with their personal data collected by an organization (known as the data controller).
If your organization collects, hosts, or analyzes the personal data of EU residents, GDPR provisions require you to use data processors that guarantee their ability to implement the technical and organizational requirements of the GDPR. The GDPR also requires you to respond to requests from individuals, or data subjects, to receive a copy of their personal data.
Welcome to Azure.Source #27! Last week in Azure, we made quite a few announcements about updates to Azure Stream Analytics, which you’ll find captured below. In addition, Microsoft was at the 2018 NAB Show in Las Vegas, which provided an opportunity to reflect on how far Azure’s media services have come over the past year and the exciting future that lies ahead. For more information on that, see the Events section below.
Now in preview
Public preview: Integration of Stream Analytics with Azure Monitor – To improve the self-service troubleshooting experience, integration of Azure Stream Analytics with Azure Monitor is in preview. This integration provides a systematic way to deal with lost, late, or malformed data, while enabling efficient mechanisms to investigate errors caused by bad data.
Now generally available
General availability: Stream Analytics tools for Visual Studio – To help maximize end-to-end developer productivity across authoring, testing, and debugging Stream Analytics jobs, Azure Stream Analytics tools for Visual Studio are now generally available.
News & updates
Seamlessly upgrade Azure SQL Data Warehouse for greater performance and scalability – Upgrade from the Optimized for Elasticity tier to the new Optimized for Compute performance tier with a simple click in the Azure portal.
In the next decade, nearly every consumer gadget, every household appliance, and every industrial device will be connected to the Internet. These connected devices will also become more intelligent, with the ability to predict, talk, listen, and more. The companies that manufacture these devices will have an opportunity to reimagine everything and fundamentally transform their businesses with new product offerings, new customer experiences, and new business models that differentiate them from the competition.
All these everyday devices have one thing in common: a tiny chip, often smaller than your thumbnail, called a microcontroller (MCU). The MCU functions as the brain of the device, hosting the compute, storage, memory, and an operating system right on the device. Over 9 billion of these MCU-powered devices are built and deployed every year. For perspective, that’s more devices shipping every single year than the world’s entire human population. While few of these devices are connected to the Internet today, within just a few years, this entire industry, all 9 billion or more devices per year, is on a path to include connected MCUs.
Internet connectivity is a two-way street. With these devices becoming a gateway to our homes, workplaces, and sensitive data, they also become targets
Data integration is becoming more and more complex as customer requirements and expectations continuously change. There is an increasing need among users to develop and debug their Extract/Transform/Load (ETL) and Extract/Load/Transform (ELT) workflows iteratively. Now, Azure Data Factory (ADF) visual tools allow you to do iterative development and debugging.
You can create your pipelines and do test runs using the Debug capability in the pipeline canvas without writing a single line of code. You can view the results of your test runs in the Output window of your pipeline canvas. Once your test run succeeds, you can add more activities to your pipeline and continue debugging in an iterative manner. You can also cancel test runs while they are in progress. You are not required to publish your changes to the data factory service before clicking Debug. This is helpful in scenarios where you want to make sure that new additions or changes work as expected before you update your data factory workflows in dev, test, or prod environments.
Data Factory visual tools also allow you to debug up to a particular activity in your pipeline canvas. Simply put a breakpoint on the activity up to which you want to test, and the debug run will execute only that far.
Source: https://powerbi.microsoft.com/en-us/blog/last-chance-for-the-data-bi-summit-24-26-april-in-dublin-ireland/ Don’t Miss the Power BI Team at Data & BI Summit, held 24-26 April in Dublin, Ireland.
Source: https://powerbi.microsoft.com/en-us/blog/power-bi-gdpr-whitepaper-is-now-available/ In May 2018, a European privacy law, the General Data Protection Regulation (GDPR), is due to take effect. The GDPR imposes new rules on companies, government agencies, non-profits, and other…
Has your organization failed to devise a business continuity and disaster recovery plan because of the perception that it’s complex or expensive? Or perhaps you have a disaster recovery plan, but maybe you’re not testing it frequently enough because of concerns about impacting production systems.
If you’re in either category, you’ll want to start developing a plan now, especially if your company does business in the European Union (EU) or might have any data on EU citizens. The General Data Protection Regulation (GDPR), which goes into effect May 25, 2018, is the EU’s new data protection regulation. While it doesn’t explicitly require that you back up data or implement a site recovery solution, the GDPR requirements provide additional reasons to stop waiting and fine-tune your DR plan:
Under the GDPR, data controllers and data processors must “provide a copy of the personal data undergoing processing” (Article 15). According to the GDPR, companies must also have “the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident” (Article 32). The GDPR also grants EU citizens the “right to data portability” (Article 20), which you can’t grant if you lose the data.
In the modern retail environment, consumers are well-informed and expect intuitive, engaging, and informative experiences when they shop. To keep up, retailers need solutions that can help them delight their customers with personalized experiences, empower their workforce to provide differentiated customer experiences, optimize their supply chain with intelligent operations, and transform their products and services.
With global scale and intelligence built into key services, Azure is the perfect platform to build powerful apps that delight retail customers; the possibilities are endless. With a single photo, retailers can create new access points for the customer on a device of their choice. Take a look at this example of what’s possible using Microsoft’s big data and advanced analytics products.
AI can be complex; this is where Rubikloud comes in. Rubikloud is focused on accessible AI products for retailers and on delivering the promise of “intelligent decision automation”. They offer a set of SaaS products, Promotion Manager and Customer Lifecycle Manager, that help retailers automate and optimize mass promotional planning and loyalty marketing. These products help retailers reduce the complexities of promotion planning and store allocations and better predict their customers’ intentions and behavior throughout the retail life cycle.
Azure SQL Data Warehouse (SQL DW) continues to introduce updates to the Azure portal to provide a seamless user experience when monitoring, managing, and integrating your data warehouse.
Support for Azure Monitor metrics
SQL DW now supports Azure Monitor, a built-in monitoring service that collects performance and health telemetry for your data warehouse. Azure Monitor not only enables you to monitor your data warehouse within the Azure portal, but its tight integration with other Azure services also enables you to monitor your entire data analytics solution within a single interface. For this release, the following data warehouse metrics have been enabled to help you identify performance bottlenecks and user activity:
Successful connections
Failed connections
Blocked by firewall connections
CPU
IO
DWU limit
DWU percentage
DWU used
These metrics now have a one-minute frequency for near real-time visibility into resource bottlenecks of your data warehouse. There is a default retention period of 90 days for all data warehouse metrics with Azure Monitor.
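As a rough sketch of how these metrics can be pulled outside the portal, the Azure CLI’s `az monitor metrics list` command can query Azure Monitor for a given resource. The resource ID below is a placeholder, and the exact metric name (`dwu_used`) is an assumption based on the list above; check your warehouse’s available metrics before relying on it.

```shell
# Hedged sketch: query one-minute DWU usage for a SQL Data Warehouse
# via Azure Monitor. Subscription, resource group, server, and database
# names are placeholders; requires an authenticated Azure CLI session.
az monitor metrics list \
  --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Sql/servers/<server>/databases/<dw>" \
  --metric "dwu_used" \
  --interval PT1M \
  --output table
```

The `PT1M` interval matches the one-minute metric frequency described above; results are limited by the 90-day retention window.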
Pin configured charts for your data warehouse through Azure dashboards.
Safely manage costs by pausing
The pause feature for SQL
Source: https://powerbi.microsoft.com/en-us/blog/join-the-power-bi-team-at-microsoft-build/ Microsoft Build is returning to Seattle in May. Build is the ultimate annual Microsoft developer event, where the most innovative minds in tech meet under one roof to get inspired, create tomorrow, and…