This morning, at the Microsoft Build conference in Seattle, I talked about the key areas of new Azure innovation that enable the intelligent cloud and intelligent edge – spanning developer tools, DevOps, containers, serverless, Internet of Things (IoT) and artificial intelligence (AI).
Innovation starts with developers writing code. The effectiveness of your dev tools is at the heart of turning your ideas into reality. With this in mind, we continue to deliver new innovation and experiences in the Visual Studio family. Whether it is Visual Studio, VS Code, or Visual Studio Team Services for DevOps, we are committed to providing the most productive end-to-end developer experience. Today, we announced a preview of Visual Studio IntelliCode, which brings AI to everyday development by providing intelligent suggestions that improve code quality and productivity. We also announced the preview of Live Share, which lets developers collaborate and problem-solve on their code across Visual Studio and VS Code. Finally, building on our shared commitment to developers and open source, we also announced a fantastic partnership with GitHub under which Visual Studio App Center will be natively available in GitHub via their marketplace. This means any GitHub developer building mobile apps for iOS, Android, Windows and macOS
Special thanks to Robert Venable, Principal Software Engineer on the Finance Engineering team of Core Services Engineering (formerly Microsoft IT), for sharing the story of how his team enables development teams while ensuring security and compliance. Thanks also to Scott Hoag, Principal Cloud Solutions Architect at Opsgility, and Rob Dendtler, Account Technology Strategist at Microsoft, for reviewing and providing invaluable feedback.
One of the questions that members of the Core Services Engineering and Operations teams frequently get when speaking to customers at the Executive Briefing Center here in Redmond is: how do our engineering teams secure our Azure footprint for our line-of-business applications while still giving developers the freedom to move fast, maintain visibility into our environment, and use the capabilities of Visual Studio Team Services for CI/CD, release management, and much more?
At the core of the answer is how we use the combination of subscriptions, resource groups, and Role-Based Access Control (RBAC) to ensure compliance with a set of guidelines.
Let’s start at the top level: Azure subscriptions. CSEO, as you can imagine, has a lot of line-of-business applications, currently over a thousand. We loosely follow the business unit pattern from the Azure enterprise scaffold – prescriptive subscription
We are happy to announce that HDInsight Tools for VSCode now supports argparse and accepts parameter-based PySpark job submission. We also enabled the tools to support Spark 2.2 for PySpark authoring and job submission.
The argparse feature gives you great flexibility in authoring, testing, and submitting your PySpark code, for both batch and interactive queries. You can take full advantage of PySpark argparse and simply keep your configuration and job-related arguments in a JSON-based configuration file.
The Spark 2.2 update allows you to benefit from the new functionality and to consume the new libraries and APIs from Spark 2.2 in VSCode. You can create, author, and submit a Spark 2.2 PySpark job to a Spark 2.2 cluster. Because Spark 2.2 is backward compatible, you can also submit your existing Spark 2.0 and Spark 2.1 PySpark scripts to a Spark 2.2 cluster.
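As a sketch of what the argparse support enables, a PySpark script can declare its inputs with argparse and receive them at submission time. The script below is an illustrative example; the argument names, paths, and job logic are assumptions, not part of the tools themselves:

```python
# Hypothetical PySpark job whose inputs are declared with argparse.
import argparse

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Word-count PySpark job (illustrative)")
    parser.add_argument("--input", required=True, help="Input path, e.g. a wasb:// URI")
    parser.add_argument("--output", required=True, help="Output path")
    parser.add_argument("--min-count", type=int, default=1,
                        help="Drop words that occur fewer than this many times")
    return parser.parse_args(argv)

def main():
    args = parse_args()
    # Imported here so the argument parsing above stays testable without Spark installed.
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.appName("wordcount").getOrCreate()
    counts = (spark.sparkContext.textFile(args.input)
              .flatMap(lambda line: line.split())
              .map(lambda word: (word, 1))
              .reduceByKey(lambda a, b: a + b)
              .filter(lambda kv: kv[1] >= args.min_count))
    counts.saveAsTextFile(args.output)

if __name__ == "__main__":
    main()
```

Keeping the arguments in argparse means the same script can be exercised locally, in interactive queries, or in batch submission just by varying the argument list.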
Summary of key new features

Argparse support – set up your arguments in JSON format.

1. Set up configurations: go to the Command Palette and choose the command HDInsight: Set Configuration.
2. Set up the parameters in the xxx_hdi_settings.json file, including script to cluster, Livy configuration, Spark configuration, etc.
Spark 2.2 Support – Submit PySpark batch
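To give a feel for the settings file mentioned in step 2, the sketch below generates an illustrative xxx_hdi_settings.json-style payload. Every key and value here is an assumption for illustration, not the tool's documented schema; consult the extension's documentation for the real field names:

```python
# Illustrative settings payload only; the key names are assumptions, not the
# documented schema of the HDInsight Tools settings file.
import json

settings = {
    "cluster": "my-spark22-cluster",  # hypothetical target cluster
    "livyConfiguration": {"driverMemory": "4G", "executorCores": 2},
    "sparkConfiguration": {"spark.sql.shuffle.partitions": "200"},
    "arguments": ["--input", "wasb://data/in", "--output", "wasb://data/out"],
}

print(json.dumps(settings, indent=2))
```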
This blog post was co-authored by Barry Luijbregts, Azure MVP.
If you’re a .NET developer, we’re excited to introduce a new resource that can help you learn about Azure: The Azure Quick Start Guide for .NET Developers!
This guide shows .NET developers how they can start with Azure and get the most out of it. The e-book is also great for .NET developers who already use Azure and want to learn more about which Azure services are available to them, and the tools they can use to develop applications for the platform.
Specifically, the Azure Quick Start Guide for .NET Developers covers:
What Azure can do for you as a .NET developer. The guide describes how Azure can take care of things like automatic scaling, continuous integration/continuous delivery (CI/CD), and much more, so you can focus on creating the things that really matter and add value.

A catalog that explains which Azure services to use when, for:

Running your .NET applications in Azure. There are many services that you can use to run your application in Azure. For instance, the guide discusses the differences between running your app in a service like Azure App
This post series provides the latest updates and news for Visual Studio Team Services and is a great way for Azure users to stay up to date with new features being released every three weeks. Visual Studio Team Services offers the best DevOps tooling to create an efficient continuous integration and release pipeline to Azure. With the rapidly expanding list of features in Team Services, teams can leverage it more efficiently for all areas of their Azure workflow, for apps written in any language and deployed to any OS.
Chain related builds together using build completion triggers
Large products have several components that are dependent on each other. These components are often independently built. When an upstream component (a library, for example) changes, the downstream dependencies have to be rebuilt and revalidated. Teams typically manage these dependencies manually.
Now you can trigger a build upon the successful completion of another build. Artifacts produced by an upstream build can be downloaded and used in the later build, and you can also get data from these variables: Build.TriggeredBy.BuildId, Build.TriggeredBy.BuildDefinitionId, Build.TriggeredBy.BuildDefinitionName. See the build triggers documentation for more information.
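As a sketch of consuming these variables in a downstream build step: on the build agent, pipeline variables are commonly exposed as environment variables with the dots mapped to underscores. That mapping, and the step below, are assumptions for illustration; verify the exact variable names in the build triggers documentation:

```python
# Hypothetical downstream build step that reads the Build.TriggeredBy.* variables,
# assuming the agent exposes them as underscore-separated environment variables.
import os

def describe_trigger(env=None):
    env = os.environ if env is None else env
    build_id = env.get("BUILD_TRIGGEREDBY_BUILDID")
    if build_id:
        name = env.get("BUILD_TRIGGEREDBY_BUILDDEFINITIONNAME", "unknown definition")
        return f"Triggered by upstream build {build_id} ({name})"
    return "Not triggered by an upstream build (manual or CI trigger)"

print(describe_trigger())
```

A step like this lets a downstream build fetch exactly the artifacts produced by the upstream build that triggered it, instead of guessing at "latest".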
This feature was prioritized based on what is currently the second-highest voted suggestion with
To provide more authentication options, Azure Toolkit for Eclipse now supports integration with HDInsight clusters through Ambari for job submission, browsing cluster resources, and navigating storage files. You can easily link or unlink any cluster by using an Ambari-managed username and password, which is independent of your Azure sign-in credentials. The Ambari connection applies to normal Spark and Hive clusters hosted within HDInsight on Azure. These additions give you more flexibility in how you connect to your HDInsight clusters, in addition to your Azure subscriptions, while also simplifying the experience of submitting Spark jobs.
With this release, you can benefit from the new functionality and consume the new libraries and APIs from Spark 2.2 in Azure Toolkit for Eclipse. You can create, author, and submit a Spark 2.2 project to a Spark 2.2 cluster. Because Spark 2.2 is backward compatible, you can also submit your existing Spark 2.0 and Spark 2.1 projects to a Spark 2.2 cluster.
How to link a cluster

1. Click Link a cluster from Azure Explorer.
2. Enter the Cluster Name, Storage Account, and Storage Key, then select a container from Storage Container, and finally enter the Username and Password. Click the OK button to link the cluster.
Please note that you
To provide more authentication options, Azure Toolkit for IntelliJ now supports integration with HDInsight clusters through Ambari for job submission, browsing cluster resources, and navigating storage files. You can easily link or unlink any cluster by using an Ambari-managed username and password, which is independent of your Azure sign-in credentials. The Ambari connection applies to normal Spark and Hive clusters hosted within HDInsight on Azure. These additions give you more flexibility in how you connect to your HDInsight clusters, in addition to your Azure subscriptions, while also simplifying the experience of submitting Spark jobs.
With this release, you can benefit from the new functionality and consume the new libraries and APIs from Spark 2.2 in Azure Toolkit for IntelliJ. You can create, author, and submit a Spark 2.2 project to a Spark 2.2 cluster. Because Spark 2.2 is backward compatible, you can also submit your existing Spark 2.0 and Spark 2.1 projects to a Spark 2.2 cluster.
How to link a cluster

1. Click Link a cluster from Azure Explorer.
2. Enter the Cluster Name, Storage Account, and Storage Key, then select a container from Storage Container, and finally enter the Username and Password.
Please note that you can use either the Ambari username and password or
We recently released a port of our Azure IoT Hub C SDK for the iOS platform. Whether your iOS project is written in Swift or Objective-C, you can leverage our device SDK and service SDK directly and begin turning your iOS device into an IoT device! Our libraries are available on CocoaPods, a popular package manager for iOS, and the source code is available on GitHub.
iOS devices have traditionally not been viewed as IoT devices, but recently they have been gaining traction in the IoT space. Here are some of the interesting scenarios we gathered from our industry customers during the preview phase:
An iOS device as the gateway for leaf devices or sensors on the factory floor.
An iOS device in a meeting room, acting as an end IoT device that sends and receives messages from Azure IoT Hub.
An iOS device used to view visualizations of IoT telemetry.
An iOS device used to manage IoT Hub operations.
So, what is in the box? If you have interacted with our Azure IoT Hub C SDK before, this will be familiar to you. Our C SDK is written in C99 for maximum portability to various platforms. The porting process involves writing a thin adaptation layer for
In the past, Azure customers on an Enterprise Agreement (EA) have had subscriptions that are centrally controlled by the company’s cloud operations or IT team. When a team or employee in the company wants to start using Azure, they need access to the EA enrollment so that usage gets billed to the company’s EA. To do that, the employee or team makes a request to the central cloud operations team, goes through approval, and has an Azure subscription provisioned as prescribed by the company’s cloud governance policies. During this process, an EA subscription must be manually created using the Azure Account Center. As these companies’ Azure adoption increases, this manual subscription-creation step becomes a scalability bottleneck in their cloud management.
To unblock these customers, we’ve created an API and a suite of SDKs for Azure EA subscription creation.
Get started with Azure EA subscription creation API
To get started, see documentation at Programmatically create Azure enterprise subscriptions (preview) and our sample code. In this release, you can
Create an Azure EA subscription (regular or dev/test) as an Account Owner.
Use Azure RBAC to give another user or service principal permission to create subscriptions on behalf of an Account
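As a rough sketch of what a programmatic subscription-creation call looks like: the helper below builds a request against a Microsoft.Subscription endpoint under an enrollment account. The exact URL path, api-version, and body fields are illustrative assumptions; consult the "Programmatically create Azure enterprise subscriptions" documentation and sample code for the real contract:

```python
# Illustrative request builder; the endpoint path, api-version, and body fields
# are assumptions, not the documented contract.
def build_create_subscription_request(enrollment_account_id, display_name,
                                      offer_type="MS-AZR-0017P"):
    url = (
        "https://management.azure.com/providers/Microsoft.Billing/"
        f"enrollmentAccounts/{enrollment_account_id}/providers/"
        "Microsoft.Subscription/createSubscription"
        "?api-version=2018-03-01-preview"  # assumed preview api-version
    )
    body = {
        "displayName": display_name,
        "offerType": offer_type,  # assumed EA offer code; dev/test uses a different offer
    }
    return url, body

url, body = build_create_subscription_request(
    "11111111-aaaa-bbbb-cccc-222222222222",  # hypothetical enrollment account ID
    "Team X production",
)
# An HTTP client would then POST `body` to `url` with an
# Authorization: Bearer <token> header obtained from Azure AD.
```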
Azure SQL Data Warehouse (SQL DW) continues to introduce updates to the Azure portal to provide a seamless user experience when monitoring, managing, and integrating your data warehouse.
Support for Azure Monitor metrics
SQL DW now supports Azure Monitor, a built-in monitoring service that collects performance and health telemetry for your data warehouse. Azure Monitor not only enables you to monitor your data warehouse within the Azure portal, but its tight integration with other Azure services also enables you to monitor your entire data analytics solution within a single interface. For this release, the following data warehouse metrics have been enabled to help you identify performance bottlenecks and user activity:
Successful/Failed/Blocked-by-firewall connections
CPU
IO
DWU limit
DWU percentage
DWU used
These metrics now have a one-minute frequency for near real-time visibility into resource bottlenecks of your data warehouse. There is a default retention period of 90 days for all data warehouse metrics with Azure Monitor.
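As a sketch of pulling these metrics programmatically, the helper below assembles a query against the Azure Monitor REST metrics endpoint, using a one-minute interval (PT1M) to match the frequency mentioned above. The resource ID placeholders and metric names are illustrative assumptions; check the Azure Monitor documentation for the exact metric names exposed for SQL DW:

```python
# Illustrative Azure Monitor metrics query builder; metric names and the
# resource ID below are assumptions for the sketch.
def build_metrics_query(resource_id, metric_names, timespan, interval="PT1M"):
    # PT1M (one minute) matches the metric frequency described above.
    return (
        f"https://management.azure.com{resource_id}"
        "/providers/microsoft.insights/metrics"
        "?api-version=2018-01-01"
        f"&metricnames={','.join(metric_names)}"
        f"&timespan={timespan}"
        f"&interval={interval}"
    )

url = build_metrics_query(
    "/subscriptions/<sub>/resourceGroups/<rg>/providers/"
    "Microsoft.Sql/servers/<server>/databases/<dw>",
    ["cpu_percent", "dwu_used"],  # hypothetical metric names
    "2018-05-01T00:00:00Z/2018-05-01T01:00:00Z",
)
# A GET on `url` with a bearer token would return the metric time series.
```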
Pin configured charts for your data warehouse through Azure dashboards:
Safely manage costs by pausing
The pause feature for SQL