Access Control Service is scheduled to be retired on November 7, 2018, with an extension option to February 4, 2019
The Access Control Service, otherwise known as ACS, is officially being retired. Since the ACS retirement announcement last year, many customers have reached out for our guidance and have completed their migration. Some customers communicated that their migration has started but will not complete before the deadline. We have decided to offer an extension to February 4, 2019. Failure to opt into the extension will result in your namespace being turned off on November 7, 2018. At that point, all requests to the namespace will fail.
What action is required?
If you are using ACS, you will need a migration strategy. The correct migration path for you depends on how your existing apps and services use ACS. We have published migration guidance to assist. In most cases, migration will require code changes on your part.
The Azure customers most likely to have ACS namespaces are those who signed up for Azure Service Bus prior to 2014. These namespaces can be identified by their -sb suffix. The Service Bus team has provided migration guidance and will continue to publish updates to its blog.
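As a quick illustration of the check described above, here is a minimal sketch that flags namespaces carrying the legacy suffix. The namespace names are made up for the example:

```python
def find_acs_backed_namespaces(namespaces):
    """Return the Service Bus namespaces that carry the legacy -sb suffix."""
    return [ns for ns in namespaces if ns.endswith("-sb")]

# Hypothetical namespace names for illustration
namespaces = ["contoso-sb", "fabrikam", "adventureworks-sb"]
print(find_acs_backed_namespaces(namespaces))  # ['contoso-sb', 'adventureworks-sb']
```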
One of the biggest security and compliance requirements for enterprise customers is to encrypt their data at rest using their own encryption key. This is even more critical in a post-GDPR world. Today, we’re announcing the public preview of Bring Your Own Key (BYOK) for data at rest in Apache Kafka on Azure HDInsight.
Azure HDInsight clusters already provide several levels of security. At the perimeter level, traffic can be controlled via Virtual Networks and Network Security Groups. Kerberos authentication and Apache Ranger provide the ability to finely control access to Kafka topics. Further, all managed disks are protected via Azure Storage Service Encryption (SSE). However, for some customers it is vital that they own and manage the keys used to encrypt the data at rest. Some customers achieve this by encrypting all Kafka messages in their producer applications and decrypting them in their consumer applications. This process is cumbersome and involves custom logic. Moreover, it doesn’t allow for usage of community supported connectors.
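To make the burden of that workaround concrete, here is a toy sketch of the custom encrypt-in-producer, decrypt-in-consumer logic. The XOR cipher below stands in for a real authenticated cipher (such as AES-GCM), and the problem of distributing the key to every consumer is glossed over; it is purely illustrative, not a recommended scheme:

```python
import secrets

KEY = secrets.token_bytes(32)  # must be shared out-of-band with every consumer

def _xor(data: bytes, key: bytes) -> bytes:
    # Stand-in for a real symmetric cipher; do not use XOR in production
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_message(payload: bytes) -> bytes:
    # Producer side: every record must pass through custom logic before send
    return _xor(payload, KEY)

def decrypt_message(token: bytes) -> bytes:
    # Consumer side: mirrored logic; connectors that don't know the scheme
    # see only ciphertext, which is why community connectors stop working
    return _xor(token, KEY)

msg = b'{"sensor": "s1", "reading": 21.5}'
print(decrypt_message(encrypt_message(msg)) == msg)  # True
```

Every producer and consumer, and every connector in between, has to carry this logic and the key, which is exactly the operational overhead that BYOK at the disk layer removes.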
With HDInsight Kafka’s support for Bring Your Own Key (BYOK), encryption at rest is a one-step process handled during cluster creation. Customers should use a user-assigned managed identity with Azure Key Vault (AKV) to
Frost & Sullivan has estimated the Artificial Intelligence (AI) market in healthcare IT for hospitals at $574 million in 2018 and expects it to grow at a CAGR of 65 percent to $4.3 billion in 2022! Similar explosive growth is predicted across other segments of healthcare. Machine Learning (ML) is a type of AI that has already seen successful application and particularly rapid growth in healthcare, and it shows major untapped potential to further improve healthcare going forward. Key use cases range from resource and asset optimization to readmission prevention, chatbots, anti-fraud, behavioral analytics, medical risk analytics, claims analytics, cybersecurity, and many more. The business values driving healthcare organizations to deploy AI solutions across these use cases span reducing costs, improving patient outcomes, and improving the engagement and experiences of patients and healthcare professionals. Major opportunities for startups range from creating AI products and solutions for specific use cases and healthcare needs to services for education, customizing AI solutions, integrating them with existing enterprise systems and data stores, managing and operating solutions, and so forth. See the AI in Healthcare Guide for more information on these use cases and opportunities.
Below I review key goals of any healthcare AI startup,
#FuelMyAwesome is back to celebrate developers like you, and we want to hear about all the things that make you tick and keep you inspired. Whether that’s a lucky beanie or a cold brew, a delightful desk toy or a fun way to get fit, we want you to share it with us for a chance to win cool swag.
Starting October 8, 2018, at 9:00 AM Pacific Time, @msdev will tweet weekly questions to uncover how you get in the zone, celebrate wins, recharge your batteries, and more. Reply directly to the weekly Monday tweets – including the hashtags #FuelMyAwesome and #sweepstakes – by Thursday at 11:59 PM Pacific Time to be entered for a chance to win*.
And remember to check back every Monday for a brand-new question and another chance to win.
Ready to start celebrating you? We are – because you are AWESOME.
FAQs

How do I enter? Visit twitter.com/msdev. Find the current week’s #FuelMyAwesome tweet that is pinned to the top of the timeline during each entry period. Reply using the speech bubble/comment icon underneath the @msdev tweet with the following required element: answer the prompted question. No graphic or GIF is required.
Organizations around the world are gearing up for a future powered by data, cloud, and Artificial Intelligence (AI). This week at Spark + AI Summit Europe, I talked about how Microsoft is committed to delivering cutting-edge innovations that help our customers navigate these technological and business shifts.
The driving force behind powerful AI applications is data – and getting the most out of AI requires a modern data estate. Organizations are using their data to extract important insights to drive their businesses forward and engage their customers in ways that were simply not possible before. One such example is the Real Madrid Football Club, one of the world’s top sports franchises with 500 million fans worldwide. Real Madrid built a global digital sports platform to engage one-on-one with fans, implement personalized promotional campaigns, and use data to track and analyze fan behaviors, among many other capabilities. This data-driven strategy has led to a 400 percent increase in installed fan base, and a 30 percent increase in digital revenue growth for the club.
“We used to pull data from just five sources before, but now we pull from more than 70 sources using our Microsoft Azure platform. This has enabled
A Microsoft Ignite 2018 retrospective
As you probably know by now, many thousands of people descended on Orlando, Florida last week for Microsoft Ignite 2018. In this volume of Azure.Source, I’ve gathered in one place the links you need to dig through all of the announcements we made. As you will read and see below, a LOT happened last week.
Microsoft Ignite 2018 – Vision keynote – Microsoft CEO, Satya Nadella, kicks off Ignite and Envision 2018 with this keynote that lays out the Microsoft vision that will empower every person and every organization on the planet to achieve more.
IT and developer success with Microsoft Azure – Microsoft Azure gives you the freedom to build, manage, and deploy cloud native and hybrid applications on a massive, global cloud using your favorite tools and frameworks. Join Scott Guthrie, EVP Cloud + AI & Julia White, CVP Azure, as they demonstrate the latest advances.
Microsoft Ignite | The Tour – If you missed out on attending Ignite in Orlando, don’t fret – Ignite may be coming to a city near you over the coming months. Join us at the place where developers and tech professionals continue learning alongside experts.
Office 365 holds a wealth of information about how people work and how they interact and collaborate with each other, and this valuable asset enables intelligent applications to derive value and to optimize organizational productivity. Today, application developers use the Microsoft Graph API to access Office 365 in a transactional way. This approach, however, is not efficient if you need to analyze large amounts of Office artifacts across a long time horizon. Further, Office 365 data is isolated from other business data and systems, leading to data silos and untapped opportunity for additional insights.
Azure offers a rich set of hyperscale analytics services with enterprise-grade security, available in data centers worldwide. By bringing Office 365 data into Azure, developers can harness the full power of Azure to build highly scalable and secure applications against the combination of Office 365 data and other business data.
This week at Ignite we announced the Public Preview of Microsoft Graph data connect, which enables secure, governed, and scalable access to Office 365 data in Azure. With this offering, for the very first time, all your data – organizational, customer, transactional, external
Insurance companies that sell life, health, and property and casualty insurance are using machine learning (ML) to drive improvements in customer service, fraud detection, and operational efficiency. For example, the Azure cloud is helping insurance brands save time and effort using machine learning to assess damage in accidents, identify anomalies in billing, and more.
Here are some common use cases for ML in insurance, along with resources for getting started with ML in Azure.
Eight ML use cases to improve service, optimization, automation, and scale

1. Lapse management: Identifies policies that are likely to lapse, and how to approach the insured about maintaining the policy.
2. Recommendation engine: Given similar customers, discovers where individual insureds may have too much, or too little, insurance. Then, proactively helps them get the right insurance for their current situation.
3. Assessor assistant: Once a car has been towed to a body shop, uses computer vision to help the assessor identify issues that need to be fixed. This improves accuracy, speeds up assessment, and keeps the customer informed about any repairs.
4. Property analysis: Given images of a property, identifies structures on the property and any condition issues. Insurers can proactively help customers schedule repairs by identifying issues
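The lapse management use case above boils down to scoring each policy and ranking the book for outreach. The sketch below shows the shape of that scoring step; the feature names and hand-picked weights are illustrative stand-ins for what a trained model (for example, a logistic regression fit on historical lapses) would learn:

```python
import math

# Hypothetical features and weights; a real model would learn these from data
WEIGHTS = {"missed_payments": 0.9, "premium_increase_pct": 0.05, "years_held": -0.3}
BIAS = -1.0

def lapse_probability(policy: dict) -> float:
    """Logistic score in (0, 1): higher means more likely to lapse."""
    z = BIAS + sum(WEIGHTS[k] * policy[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

policies = [
    {"id": "A", "missed_payments": 2, "premium_increase_pct": 10, "years_held": 1},
    {"id": "B", "missed_payments": 0, "premium_increase_pct": 0, "years_held": 8},
]

# Rank the book so outreach starts with the highest-risk policies
at_risk = sorted(policies, key=lapse_probability, reverse=True)
print([p["id"] for p in at_risk])  # ['A', 'B']
```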
Gaining insights rapidly from data is critical to competitiveness in today’s business world. Azure SQL Data Warehouse (SQL DW), Microsoft’s fully managed analytics platform, leverages Massively Parallel Processing (MPP) to run complex, interactive SQL queries at every level of scale.
Users today expect data within minutes, a departure from traditional analytics systems, which operated with a data latency of a day or more. To meet this need for faster data, users need ways of moving data from source systems into their analytical stores in a simple, quick, and transparent fashion. Delivering on modern analytics strategies requires that users act on current information, which means enabling the continuous movement of enterprise data, from on-premises to cloud and everything in between.
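The continuous-movement pattern can be sketched, in miniature, as a watermark-based incremental extract: each cycle pulls only the rows changed since the last high-water mark instead of re-copying the table. The table schema and watermark column here are assumptions for illustration (using an in-memory SQLite source); production change data capture tools typically read database logs rather than polling:

```python
import sqlite3

# Stand-in source system with an assumed updated_at watermark column
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at INTEGER)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, 100), (2, 20.0, 105), (3, 30.0, 110)])

def extract_since(conn, watermark):
    """Pull only rows modified after the last high-water mark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,)).fetchall()
    # Advance the watermark to the newest change seen this cycle
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

delta, wm = extract_since(src, 100)
print(len(delta), wm)  # 2 110
```

Each run loads only the delta and persists the new watermark for the next cycle, which is what keeps latency in minutes rather than days.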
We are happy to announce that Striim now fully supports SQL Data Warehouse as a target for Striim for Azure. Striim enables continuous, non-intrusive, performant ingestion of all your enterprise data from a variety of sources in real time. This means that users can build intelligent pipelines for change data capture from sources such as Oracle Exadata straight into SQL Data Warehouse. Striim can also be used to move fast
If you are a manufacturer that wants to take its first steps toward IoT and you’re overwhelmed by the plethora of vendors and platforms in the IoT space, you are not alone. IoT is still a new space, with many moving parts and products, which makes it hard for organizations to know exactly where and how to get started. In this blog, I will try to provide a simplified overview and next steps, based on the conversations I have been having with many manufacturing organizations.
Components of an IoT solution
When it comes to deciding whether to build or buy your IoT solution it is important, of course, to understand exactly what you are building or buying. To that end, it helps to identify the main components of an IoT solution stack (Figure 1). From bottom to top:
Figure 1: Building Blocks of an IoT Solution
1. Cloud platform: A set of general-purpose PaaS services used by developers to develop cloud-based solutions. These services include messaging, storage, compute, security, and more. Cloud platforms (such as Microsoft Azure) also include analytics services and IoT services.
2. IoT platform: A set of IoT-specific PaaS and SaaS services and development