We’re ringing in 2019 by announcing the general availability of the Azure IoT Hub Device Provisioning Service features we first released back in September 2018! The following features are all generally available to you today:
- Symmetric key attestation support
- Re-provisioning support
- Enrollment-level allocation rules
- Custom allocation logic
All features are available in all provisioning service regions, through the Azure portal, and the SDKs will support these new features by the end of January 2019 (with the exception of the Python SDK). Let’s talk a little more about each feature.
Symmetric key attestation
Symmetric keys are one of the easiest ways to start off using the provisioning service and provide an easy “Hello world” experience for those of you who want to get started with provisioning but haven’t yet decided on an authentication method. Furthermore, symmetric key enrollment groups provide a great way for legacy devices with limited existing security functionality to bootstrap to the cloud via Azure IoT. Check the docs to learn more about how to connect legacy devices.
Symmetric key support is available in two ways:
- Individual enrollments, in which devices connect to the Device Provisioning Service just like they do in IoT Hub.
- Enrollment groups, in which
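With enrollment groups, each device typically authenticates using a device-specific key derived from the group's master key by computing an HMAC-SHA256 of the device's registration ID. A minimal sketch of that derivation, using a hypothetical group key and registration ID for illustration only:

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64: str, registration_id: str) -> str:
    """Derive a per-device symmetric key from an enrollment group key.

    The base64-encoded group key is decoded and used as an HMAC-SHA256
    key over the device's registration ID; the digest is base64-encoded
    to produce the device key.
    """
    key_bytes = base64.b64decode(group_key_b64)
    digest = hmac.new(key_bytes, registration_id.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical group master key and registration ID, for illustration only.
group_key = base64.b64encode(b"example-group-master-key").decode("utf-8")
device_key = derive_device_key(group_key, "my-device-001")
```

Each device then uses its derived key to attest, so the group master key never needs to be placed on the devices themselves.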
Technology allows manufacturers to generate more data than traditional systems and their users can digest. Predictive analytics, enabled by big data and cloud technologies, can take advantage of this data and provide new and unique insights into the health of manufacturing equipment and processes. While most manufacturers understand the value of predictive analytics, many find it challenging to introduce into the line of business. Symphony Industrial AI has a mission: to bring the promise of Industrial IoT (IIoT) and artificial intelligence (AI) to reality by delivering real value to its customers through predictive operations solutions. Two solutions by Symphony are specially tailored to the process manufacturing sector (chemicals, refining, pulp and paper, metals and mining, and oil and gas).
There are two solutions offered by Symphony Industrial AI:
The first focuses on existing machinery, and the second on common processes.
Problem: the complexity of data science
Manufacturers have deep knowledge of their manufacturing processes, but they typically lack the expertise of data scientists, who have a deep understanding of statistical modeling, a fundamental component of most predictive analytics applications. And even when the application of predictive analytics is a success, most deployments fail to provide users with
Questions about the security of and control over customer data, and where it resides, are on the minds of cloud customers today. We’re hearing you, and in response, we published a whitepaper that gives clear answers and guidance into the security, data residency, data flows, and compliance aspects of Microsoft Azure. The paper is designed to help our customers ensure that their customer data on Azure is handled in a way that meets their data protection, regulatory, and sovereignty requirements.
Transparency and control are essential to establishing and maintaining trust in cloud technology, and restricted and regulated industries have additional requirements for risk management and ongoing compliance. To address this, Microsoft provides an industry-leading security and compliance portfolio.
Security is built into the Azure platform beginning with the development process, which is conducted in accordance with the Security Development Lifecycle (SDL). Azure also includes technologies, controls, and tools that address data management and governance, such as Active Directory identity and access controls, network and infrastructure security technologies and tools, threat protection, and encryption to protect data in transit and at rest.
Microsoft gives customers options so they can control the types of data and locations where customer data
At Ignite 2018, in response to customer feedback, the Azure Database for PostgreSQL team announced the preview of Query Store (QS), Query Performance Insight (QPI), and Performance Recommendations (PR) to help ease performance troubleshooting. This blog aims to inspire ideas on how you can use these currently available features to troubleshoot some common scenarios.
A previous blog post on performance best practices touched upon the layers at which you might be experiencing issues, based on the application pattern you are using. That post neatly categorizes the problem space into several areas, along with common techniques for ruling out possibilities and quickly getting to the root cause. We would like to expand on it further with the help of these newly announced features (QS, QPI, and PR).
In order to use these features, you will need to enable data collection by setting pg_qs.query_capture_mode and pgms_wait_sampling.query_capture_mode to ALL.
Once data collection is enabled, Query Store can help you troubleshoot a wide variety of scenarios. In this article, we will limit the scope to the regressed queries scenario.
One of the important scenarios that Query Store enables you to monitor is the
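As a sketch of what a regressed-query check might look like, the snippet below compares each query's most recent mean execution time against its earlier average. The SQL text assumes the documented query_store.qs_view runtime-statistics view; the time window, threshold, and sample values are illustrative only:

```python
# Illustrative SQL against Query Store's runtime statistics; column names
# follow the documented query_store.qs_view schema.
REGRESSION_SQL = """
SELECT query_id, start_time, mean_time
FROM query_store.qs_view
WHERE start_time >= now() - interval '7 days'
"""

def find_regressions(rows, threshold=1.5):
    """Flag query_ids whose latest mean_time exceeds their earlier
    average by the given factor.

    Each row is a (query_id, start_time, mean_time) tuple, as returned
    by the SQL above.
    """
    from collections import defaultdict
    history = defaultdict(list)
    for query_id, start_time, mean_time in sorted(rows, key=lambda r: r[1]):
        history[query_id].append(mean_time)
    regressed = []
    for query_id, times in history.items():
        if len(times) < 2:
            continue
        baseline = sum(times[:-1]) / len(times[:-1])
        if baseline > 0 and times[-1] / baseline >= threshold:
            regressed.append(query_id)
    return regressed

# Hypothetical sample: query 42's mean time roughly doubled in the
# latest window, while query 7 stayed flat.
sample = [(42, 1, 10.0), (42, 2, 11.0), (42, 3, 21.0),
          (7, 1, 5.0), (7, 2, 5.1)]
```

In practice you would run REGRESSION_SQL through your PostgreSQL client of choice and feed the resulting rows to find_regressions; the point of the sketch is the comparison logic, not a specific driver.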
Source: https://blogs.msdn.microsoft.com/sql_server_team/replica-failover-within-the-secondary-availability-group-in-a-distributed-availability-group/ A distributed availability group (DAG) is a special type of availability group that spans two availability groups. This blog clarifies some issues regarding failover in a DAG. A simple DAG ‘TIWENDAG’ is created for the relevant tests.
We are excited to announce that the January release of Azure Data Studio (formerly known as SQL Operations Studio) is now available.
Note: If you are currently using the preview version, SQL Operations Studio, and would like to retain your settings when upgrading to the latest version, please follow these instructions. After downloading Azure Data Studio, click “Yes” to enable preview features so that you can use extensions.
Azure Data Studio is a new cross-platform desktop environment for data professionals using the family of on-premises and cloud data platforms on Windows, macOS, and Linux. To learn more, visit our GitHub.
Azure Data Studio was announced generally available at Microsoft Ignite 2018. If you missed it, you can view the GA announcement here. You won’t want to miss the feature comparison matrix between SQL Server Management Studio (SSMS) and Azure Data Studio, which may answer many of your questions.
Check out the video below for a general overview of Azure Data Studio.
The key highlights for the January release include:
- Azure Active Directory authentication support
- Data-Tier Application Wizard support
- IDERA SQL DM Performance
In this post, we will explore how to use automated machine learning (AutoML) to create new machine learning models over your data in SQL Server 2019 big data clusters.
SQL Server 2019 big data clusters make it possible to use the software of your choice to fit machine learning models on big data and use those models to perform scoring. In fact, Apache Spark™, the popular open source big data framework, is now built in! Apache Spark™ includes the MLlib machine learning library, and the open source community has developed a wealth of additional packages that integrate with and extend Apache Spark™ and MLlib.
Automated machine learning
Manually selecting and tuning machine learning models requires familiarity with a variety of model types and can be laborious and time-consuming. Software that automates this process has recently become available, relieving both novice and expert data scientists and ML engineers of much of the burden of manual model selection and tuning.
H2O’s open source AutoML APIs
H2O provides popular open source software for data science and machine learning on big data, including Apache Spark™ integration. It provides two open source Python AutoML classes: h2o.automl.H2OAutoML and pysparkling.ml.H2OAutoML. Both APIs use the same
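As a sketch of what using the first of these classes might look like: the helper below fits an AutoML run on a CSV file and returns the leaderboard of candidate models. It assumes a local H2O cluster can be started, and the file path and column names in the usage note are hypothetical:

```python
def run_h2o_automl(train_path: str, target: str, max_models: int = 10):
    """Fit an H2O AutoML run on a CSV file and return the leaderboard.

    Imports are kept inside the function because this sketch needs a
    running H2O cluster (and a local JVM) to actually execute.
    """
    import h2o
    from h2o.automl import H2OAutoML

    h2o.init()                          # start or attach to a local cluster
    train = h2o.import_file(train_path)
    features = [c for c in train.columns if c != target]

    aml = H2OAutoML(max_models=max_models, seed=1)
    aml.train(x=features, y=target, training_frame=train)
    return aml.leaderboard              # candidate models, ranked

# Usage (hypothetical file and column names):
# leaderboard = run_h2o_automl("sensor_data.csv", target="failure")
```

AutoML handles the model selection and hyperparameter tuning internally, which is exactly the manual burden described above.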
It’s become a reliable January tradition for manufacturers to introduce an amazing array of consumer devices at the Consumer Electronics Show (CES). These new devices enter a booming market, and Gartner predicts “14.2 billion connected things will be in use in 2019, and that the total will reach 25 billion by 2021.” This year connected devices will dominate CES again as device manufacturers lean further into their vision for the smart home. These devices’ promise is resonant: smart home experiences will remove friction from our day-to-day lives, save us money, keep us healthy, and help us lower our environmental footprints, ultimately empowering us all to achieve more.
As people open their personal lives and spaces to these smart experiences, they’re also becoming increasingly attuned to the security risks that smart technology can introduce. Their concern builds as news headlines give shape to the many ways that smart devices are being weaponized by attackers to invade personal privacy, steal sensitive data, and take down infrastructure with scaled attacks.
We set out to better understand how the allure of smart device experiences stacks up against the concern for security, and who consumers hold responsible to secure smart devices. To this end, we
Each year at CES, we see dozens of new product innovations that bring additional convenience, entertainment, efficiency – or completely new experiences to our daily lives. By bringing the power of the cloud to connected devices, the Internet of Things (IoT) and artificial intelligence (AI) have played an ever-expanding role in driving the connected product business opportunity. Today, smart thermostats, speakers, TVs, appliances, cars, and more are no longer serving an “early adopter” market – they are entering the mainstream – as people look for technology to help enrich how they plan and experience their daily lives.
Our Azure IoT and AI strategy enables customers to build these new products and solutions using the power of the intelligent cloud and intelligent edge, at scale. The Azure IoT platform helps customers build consistent AI-based applications and experiences from the cloud to the edge that are adaptive and responsive to physical environments, from smart cities and spaces to connected products in homes and on the manufacturing floor. Our Azure AI services combine the latest advances in technologies like machine learning and deep learning with our comprehensive data, Azure cloud and productivity platform, and a trusted, enterprise-grade approach.
We are continuing
Since our preview announcement, hundreds of customers have been moving recurring workloads, media captures from automobiles, incremental transfers for ongoing backups, and archives from remote office/branch offices (ROBOs) to Microsoft Azure. We’re excited to announce the general availability of Azure Data Box Disk, an SSD-based solution for offline data transfer to Azure. Data Box Disk is now available in the US, EU, Canada, and Australia, with more countries/regions to be added over time. Also, be sure not to miss the announcement of the public preview for Blob Storage on Azure Data Box below!
Top three reasons customers use Data Box Disk

1. Easy to order and use: Each disk is an 8 TB SSD. You can easily order packs of up to five disks from the Azure portal, for a total capacity of 40 TB per order. The small form factor provides the right balance of capacity and portability to collect and transport data in a variety of use cases. Support is available for Windows and Linux.
2. Fast data transfer: These SSD disks copy data at up to USB 3.1 speeds and support the SATA II and III interfaces. Simply mount the disks as drives and use any tool of choice such