We are seeing more developers building and running their applications in the public cloud; in fact, many companies use multiple public clouds to run their applications. Our customers tell us they choose to build applications in Azure because it's easy to get started and because they have peace of mind knowing that the services their applications rely on will be available, reliable, and secure. Today, we are going to discuss how Azure Security Center's Just-in-Time VM Access can help you secure the virtual machines that are running your applications and code.
Successful attacks on your virtual machines can create serious challenges for development. If a server is compromised, your source code could be exposed, along with proprietary algorithms or internal knowledge about the application. The pace of development can slow because your team is focused on recovering from the attack instead of writing and reviewing code. Most importantly, an attack can affect your customers' ability to access your applications, impacting your brand and your business. Just-in-Time VM Access helps reduce your exposure to attacks by limiting the amount of time that management ports are open on the virtual machines running your code.
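To make the port-gating idea concrete, here is a minimal Python sketch of the just-in-time pattern: a management port stays closed by default, and an approved request opens it only for a bounded window. The class and method names are illustrative, not part of any Azure SDK; in Azure, this policy is enforced for you by Security Center and network security group rules.

```python
from datetime import datetime, timedelta

class JitPortPolicy:
    """Conceptual sketch of just-in-time port access: a management port
    (e.g. SSH 22 or RDP 3389) stays closed unless an approved request
    opens it for a limited window. Names are illustrative only."""

    def __init__(self, port, max_duration_hours=3):
        self.port = port
        self.max_duration = timedelta(hours=max_duration_hours)
        self.open_until = None  # closed by default

    def request_access(self, requested_hours):
        # Clamp the request to the policy's maximum allowed window.
        granted = min(timedelta(hours=requested_hours), self.max_duration)
        self.open_until = datetime.utcnow() + granted
        return granted

    def is_open(self, now=None):
        now = now or datetime.utcnow()
        return self.open_until is not None and now < self.open_until

policy = JitPortPolicy(port=3389, max_duration_hours=3)
print(policy.is_open())                    # False: closed until access is requested
policy.request_access(requested_hours=8)   # clamped to the 3-hour maximum
print(policy.is_open())                    # True, but only for the granted window
```

The key property is the default-closed state: an attacker scanning for open management ports finds nothing unless a request window happens to be active.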
Just-in-Time VM Access
mssql-cli is a new interactive command-line query tool for SQL Server. This open-source tool works cross-platform and is part of the dbcli community.
See the install guide to download mssql-cli and get started.
mssql-cli auto-completion in action
In this release (v0.13.0), highlights include:
- Apt-get and Linux packaging support
- GDPR compliance
- New demo video

Apt-get and Linux packaging support
One of the key issues Linux users run into when setting up mssql-cli for the first time is not having the right version of Python, or having to install Python for the first time. We want to make the first experience with mssql-cli painless, so we added apt-get support, which packages Python with the installation to improve the acquisition experience.
For full instructions to acquire mssql-cli for each Linux distribution, please check out the Linux installation guide.
Note: If you previously installed mssql-cli via pip, first run
sudo pip uninstall mssql-cli
Then, follow the installation instructions.
As many of us are familiar with, GDPR (the General Data Protection Regulation) is a European Union privacy regulation that took effect on May 25, 2018. This release includes updates to help mssql-cli handle user data in compliance with it.
With Azure Data Factory (ADF) visual tools, we listened to your feedback and enabled a rich, interactive visual authoring and monitoring experience. It allows you to iteratively create, configure, test, deploy, and monitor data integration pipelines without any friction. The main goal of the ADF visual tools is to let you be productive with ADF by getting pipelines up and running quickly, without having to write a single line of code.
We continue to add new features to increase productivity and efficiency for both new and advanced users with intuitive experiences. You can get started by clicking the Author and Monitor tile in your provisioned v2 data factory blade.
Check out some of the exciting new features enabled with data factory visual tools since public preview (January 2018):
- Latest data factory updates: follow exciting new updates to the data factory service.
- View the data factory deployment region and resource group, then switch to another data factory that you have access to.
- Ingest data at scale from more than 70 on-premises/cloud data sources in a serverless fashion.
- New activities in the toolbox, including the Notebook Activity.
This is a partner post authored with the Tableau team.
As part of Microsoft's open and flexible platform for data and analytics, we're always excited when partners add features that expand customer options and extend business functionality. Business intelligence partner Tableau has recently introduced enhancements that make it easier for their users to process spatial data stored in SQL Server.
The SQL Server database engine has two spatial types: geometry and geography. The geography type helps organize geospatial mapping data in SQL Server tables and works with several native spatial functions to answer questions such as how far apart two geographic locations are, or which locations fall within a certain radius.
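To illustrate the kind of question the geography type answers, here is a small self-contained Python sketch that computes a great-circle distance between two coordinates. In SQL Server you would get an equivalent (ellipsoidal, meter-valued) answer from `geography::Point` and the `STDistance` method, so the results differ slightly; this spherical approximation is only for intuition.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points,
    using a spherical Earth model (mean radius 6371 km)."""
    R = 6371.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * R * asin(sqrt(a))

# Seattle to Portland, roughly 234 km as the crow flies.
print(haversine_km(47.6062, -122.3321, 45.5152, -122.6784))
```

A radius question ("which locations fall within X km?") is then just a filter on this distance, which is exactly what `STDistance` enables directly inside a SQL WHERE clause.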
Previously, customers had to export geography spatial data stored in SQL Server to Shapefiles before they could access it from Tableau. Now, with Tableau 2018.1, customers can connect to and visualize that data directly. This means Tableau will recognize spatial data in your SQL Server tables without any intermediate steps, and customers can leverage the spatial operations for geography supported by SQL Server, or custom queries, to work with geographic data stored in SQL Server. Alongside support for native spatial data, Tableau 2018.1 also includes
Source: https://powerbi.microsoft.com/en-us/blog/just-announced-bestselling-author-malcolm-gladwell-to-keynote-at-microsoft-business-applications-summit/
We're bringing our BizApps community together for the first-ever Microsoft Business Applications Summit, *the* place for Dynamics 365, Power BI, Excel, PowerApps and Flow superfans to connect, collaborate and pack in as much…
Source: https://powerbi.microsoft.com/en-us/blog/announcing-dashboard-theming-in-the-power-bi-service/
Today, I am thrilled to announce the availability of dashboard theming in the Power BI service. Power BI dashboards pull together reports, images, Excel workbooks, and more, to provide a 360-degree…
Kafka on HDInsight is Microsoft Azure’s managed Kafka cluster offering. It has generated huge customer interest and excitement since its general availability in December 2017.
Here are 10 great things about it:
1. Enterprise-grade Kafka
HDInsight Kafka is an enterprise-grade managed cluster service giving you a 99.9 percent SLA. We take care of the heavy lifting for you, including operations, monitoring, security patching, and more. Running some of the largest Kafka clusters on the planet, we have the expertise and knowledge to prevent and quickly fix any issues.
2. Up and running in minutes
Create a full cluster in minutes, not weeks. Traditionally, it takes weeks to provision your own Kafka cluster on-premises or on IaaS; with HDInsight, you get your own dedicated Kafka cluster within a few clicks.
3. Real open-source Kafka
With HDInsight, you get the true Apache Kafka engine running on your clusters with all its functionality. People love Kafka for its high throughput, low latency, and cost-efficient scale. Along with the Producer and Consumer APIs, Kafka also has a rich feature set, such as message compression for even higher throughput and a configurable retention policy (including time-based retention beyond 7 days and size-based retention).
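Kafka's retention model is easy to reason about with a toy sketch: log segments are dropped once they exceed a time limit (what Kafka's `retention.ms` broker setting controls) or once the log exceeds a size cap (`retention.bytes`). The Python class below is illustrative only; real Kafka prunes whole segment files on a background thread.

```python
from collections import deque

class RetentionLog:
    """Toy sketch of Kafka-style log retention with both time-based
    and size-based limits. Not a real Kafka component."""

    def __init__(self, retention_secs, retention_bytes):
        self.retention_secs = retention_secs
        self.retention_bytes = retention_bytes
        self.segments = deque()  # (timestamp, size_bytes), oldest first

    def append(self, size_bytes, now):
        self.segments.append((now, size_bytes))
        self._prune(now)

    def _prune(self, now):
        # Time-based retention: drop segments older than the window.
        while self.segments and now - self.segments[0][0] > self.retention_secs:
            self.segments.popleft()
        # Size-based retention: drop oldest segments until under the cap.
        while sum(size for _, size in self.segments) > self.retention_bytes:
            self.segments.popleft()

log = RetentionLog(retention_secs=7 * 24 * 3600, retention_bytes=1000)
log.append(400, now=0)
log.append(400, now=10)
log.append(400, now=20)   # total would be 1200 bytes: oldest segment evicted
print(len(log.segments))  # 2 segments remain (800 bytes)
```

Either limit can fire independently, which is why Kafka lets you tune both per topic.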
Last week, thousands of developers gathered in Seattle for Microsoft Build 2018, Microsoft’s annual developer conference, to learn how we are empowering them to advance the future of society with AI and the intelligent edge. This week’s Azure.Source covers a lot of content. For a more concise (TL;DR) summary of Build 2018, check out the Microsoft press release, Microsoft Build highlights new opportunity for developers, at the edge and in the cloud. For a roll-up of all the Azure announcements, see: Azure Announcements from Microsoft Build. And if you want to dig into all of the content delivered at Build 2018 session-by-session, you can now do so at Microsoft Build Live 2018.
New Azure innovations are helping developers write code today for tomorrow’s technology challenges – Scott Guthrie’s technology keynote at Build 2018 focused on the key areas of new Azure innovation that enable the intelligent cloud and intelligent edge – spanning developer tools, DevOps, containers, serverless, Internet of Things (IoT), and artificial intelligence (AI).
Technology Keynote: Microsoft Azure – Watch the keynote to see how we are helping every developer be an AI developer on Azure. Learn about and see demos of
Since its release in 2016, developers have been using our Azure IoT Python SDK to write device and back-end applications that connect to Azure IoT Hub and the Device Provisioning Service, as well as to write modules for Azure IoT Edge (preview). Python is a popular choice for prototyping, and it is gaining traction in the embedded world.
If you decide to use the Python SDK for development, there are a few things you should keep in mind. The Python SDK is a wrapper on top of our Azure IoT C SDK, and we release binary packages on pip for Windows, Ubuntu, and Raspbian, all of which are compatible with Python 2 and Python 3. This approach has its ups and downs. On the upside, the features you see in C are available in Python with no functionality difference. On the downside, it is not a native Python SDK: the Application Programming Interfaces (APIs) we expose may look different from typical Python APIs, and as a developer you will need to pay attention to the architecture of the underlying platform, especially when you are using pip!
At a high level, there are three things that must align for the Python SDK to work properly: the Python version (2 or 3), the operating system, and the processor architecture of your Python interpreter.
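Because pip selects a binary wheel based on these properties, it is worth checking what your interpreter actually reports before installing. This small standard-library snippet prints the three values in question; note that on a 64-bit OS you can still be running a 32-bit Python, which is a common source of wheel mismatches.

```python
import platform
import struct
import sys

# The wheel pip picks must match three things about your interpreter,
# since the package wraps a compiled C extension:
major = sys.version_info.major       # 1. Python version (2 or 3)
os_name = platform.system()          # 2. operating system (Windows, Linux, ...)
bits = struct.calcsize("P") * 8      # 3. interpreter architecture (32- or 64-bit)

print("Python %d / %s / %d-bit" % (major, os_name, bits))
```

If any of the three doesn't match an available wheel, pip falls back to a source build (or fails), which is exactly the painful first-run experience the packaging work aims to avoid.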
Gaining insights rapidly from data is critical to competitiveness in today’s business world. Azure SQL Data Warehouse (SQL DW), Microsoft’s fully managed analytics platform, leverages Massively Parallel Processing (MPP) to run complex interactive SQL queries at every level of scale.
An enterprise data warehouse doesn't exist in isolation; it is part of a complete analytics solution that includes the "pipelines" that extract, transform, and load data, often from many disparate sources, as well as staging and reporting layers. Tasks such as continuous integration and continuous delivery require extensive manual coding, which doesn't offer the flexibility needed for rapidly changing business needs. This is where data warehouse automation tools provide value for customers.
Data warehouse automation (DWA) tools are metadata-driven code-generation tools that streamline developing and managing a data warehouse solution. DWA tools provide more than just ETL automation; they automate the complete life cycle of a data warehouse solution, from analysis, design, and implementation to documentation, monitoring, and maintenance.
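The core idea of metadata-driven code generation can be shown in a few lines: table definitions live as metadata, and the loading SQL is generated from them rather than hand-written. The sketch below is a simplified illustration, not any particular DWA product's approach, and the table and column names are made up.

```python
# Generate an upsert (MERGE) statement from a table's metadata, the way a
# DWA tool derives loading code from design metadata. Hypothetical schema.
def generate_merge_sql(meta):
    cols = ", ".join(c["name"] for c in meta["columns"])
    src_cols = ", ".join("src." + c["name"] for c in meta["columns"])
    updates = ", ".join(
        "tgt.{0} = src.{0}".format(c["name"])
        for c in meta["columns"] if not c.get("key")
    )
    keys = " AND ".join(
        "tgt.{0} = src.{0}".format(c["name"])
        for c in meta["columns"] if c.get("key")
    )
    return (
        "MERGE {target} AS tgt\n"
        "USING {staging} AS src ON {keys}\n"
        "WHEN MATCHED THEN UPDATE SET {updates}\n"
        "WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({src_cols});"
    ).format(target=meta["target"], staging=meta["staging"],
             keys=keys, updates=updates, cols=cols, src_cols=src_cols)

customer_meta = {
    "target": "dw.DimCustomer",
    "staging": "stg.Customer",
    "columns": [
        {"name": "CustomerKey", "key": True},
        {"name": "Name"},
        {"name": "City"},
    ],
}
print(generate_merge_sql(customer_meta))
```

Changing the metadata (adding a column, renaming a table) regenerates consistent loading code everywhere it is used, which is where the life-cycle automation benefit comes from.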
Benefits of data warehouse automation