When you are working with a database, or any other kind of software, your experience is enhanced or hindered by the tools you use to interact with it. PostgreSQL has a powerful command line tool, psql, but some people much prefer a graphical editor. Even if you typically use the command line, you may sometimes want to go visual. At Microsoft we’ve spent many years building experiences to enhance developers’ day-to-day productivity. Having choices is important, because it allows you to go with the tool that works for you.
Today we’re excited to announce preview support for PostgreSQL in Azure Data Studio. Azure Data Studio is a modern, cross-platform editor focused on data development, available for Linux, macOS, and Windows. Plus, Azure Data Studio comes with an integrated terminal so you’re never far away from psql.
We’re also introducing a corresponding preview PostgreSQL extension in Visual Studio Code (VS Code). Both Azure Data Studio and VS Code are open source and extensible – two things that PostgreSQL itself is based on.
Azure Data Studio inherits a lot of VS Code functionality and supports most VS Code extensions, such as those for Python, R, and Kubernetes. If your primary
We’re excited to announce a partnership with Timescale that introduces support for TimescaleDB on Azure Database for PostgreSQL for customers building IoT and time-series workloads. TimescaleDB has a proven track record of being deployed in production in a variety of industries including oil & gas, financial services, and manufacturing. The partnership reinforces our commitment to supporting the open-source community to provide our users with the most innovative technologies PostgreSQL has to offer.
TimescaleDB allows you to scale for fast ingest and complex queries while natively supporting full SQL. It leverages PostgreSQL as an essential building block, which means that users get the familiarity and reliability of PostgreSQL, along with the scalability and performance of TimescaleDB. Enabling TimescaleDB on your new or existing Azure Database for PostgreSQL server will eliminate the need to run two databases to collect relational and time-series data.
How to get started
If you don’t already have an Azure Database for PostgreSQL server, you can create one with the Azure CLI command az postgres up. Next, run the following command to add TimescaleDB to your Postgres libraries:
az postgres server configuration set --resource-group mygroup --server-name myserver --name shared_preload_libraries --value timescaledb
Restart the server to load the TimescaleDB library.
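Putting the steps above together, the full sequence might look like the following sketch. The resource group and server names (mygroup, myserver) are placeholders from the example command; substitute your own.

```shell
# Add timescaledb to the server's preloaded libraries
az postgres server configuration set \
  --resource-group mygroup --server-name myserver \
  --name shared_preload_libraries --value timescaledb

# Restart the server so the library is loaded
az postgres server restart --resource-group mygroup --name myserver

# Then, connected to your database with psql, enable the extension:
#   CREATE EXTENSION IF NOT EXISTS timescaledb;
```

After the extension is created, you can start creating hypertables and ingesting time-series data in the same database that holds your relational data.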
While Azure Container Registry (ACR) supports user and headless-service account authentication, customers have asked for ways to limit access to the registry’s public endpoint. Customers can now limit registry access to within an Azure Virtual Network (VNet), as well as whitelist IP addresses and ranges for on-premises services.
VNet and Firewall rules are supported with virtual machines (VM) and Azure Kubernetes Services (AKS).
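As an illustrative sketch (the registry name and IP range below are placeholders; firewall and VNet rules require the Premium SKU), the rules can be managed with the Azure CLI:

```shell
# Deny public access to the registry by default
az acr update --name myregistry --default-action Deny

# Whitelist an on-premises IP range through the registry firewall
az acr network-rule add --name myregistry --ip-address 203.0.113.0/24

# Review the rules currently in effect
az acr network-rule list --name myregistry
```

With the default action set to Deny, only the VNets and IP ranges you explicitly add can reach the registry endpoint.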
Choosing between private and PaaS registries
As customers move into production, their security teams apply a checklist to production workloads, one item of which is limiting all public endpoints. Without VNet support, customers had to choose between standalone products or OSS projects they could run and manage themselves. This placed a larger burden on customers to manage the storage, security, scalability, and reliability a production registry requires.
With VNet and Firewall rules, customers can meet their security requirements while benefiting from a PaaS container registry with integrated security that is secured at rest, geo-redundant, and geo-replicated, freeing up their resources to focus on the unique business problems they face.
Azure Container Registry PaaS, enabling registry products
The newest VNet and Firewall rule capabilities of ACR are just the latest set of capabilities in container lifecycle management. ACR provides core primitives that
How do you back up your SQL Servers today? You could be using backup software that requires you to manage backup servers, agents, and storage, or you could be writing elaborate custom scripts that require you to manage backups on each server individually. With the modernization of IT infrastructure and the world rapidly moving to the cloud, do you want to continue using legacy backup methods that are tedious, infrastructure-heavy, and difficult to scale? Azure Backup for SQL Server Virtual Machines (VMs) is the modern way of doing backup in the cloud, and we are excited to announce that it is now generally available! It is an enterprise-scale, zero-infrastructure solution that eliminates the need to deploy and manage backup infrastructure while providing a simple and consistent experience for centrally managing and monitoring backups on standalone SQL instances and Always On Availability Groups.
Built into Azure, the solution combines the core cloud promises of simplicity, scalability, security, and cost-effectiveness with native SQL backup capabilities, leveraged through native APIs, to yield high-fidelity backups and restores. The key value propositions of this solution are:
15-minute Recovery Point Objective (RPO): Working with mission-critical
Furthering our commitment to be the most trusted cloud for Government, today Microsoft is proud to announce two milestone achievements in support of the US Department of Defense.
Information Impact Level 5 DoD Provisional Authorization by the Defense Information Systems Agency
Azure Government is the first commercial cloud service to be awarded an Information Impact Level 5 DoD Provisional Authorization by the Defense Information Systems Agency. This provisional authorization allows all US Department of Defense (DoD) customers to leverage Azure Government for the most sensitive controlled unclassified information (CUI), including CUI of National Security Systems.
DoD Authorizing Officials can use this Provisional Authorization as a baseline for input into their authorization decisions on behalf of mission owner systems using the Azure Government DoD regions.
This achievement is the result of the collective efforts of Microsoft, DISA, and its mission partners to work through requirements pertaining to the adoption of cloud computing for infrastructure, platform, and productivity across the DoD enterprise.
General Availability of DoD Regions
Information Impact Level 5 requires processing in dedicated infrastructure that ensures physical separation of DoD customers from non-DoD customers. Over the past few months, we ran a preview program with more than 50 customers
This blog post was authored by Kareem Choudhry, Corporate Vice President, Microsoft Gaming Cloud.
Microsoft is built on the belief of empowering people and organizations to achieve more – it is the DNA of our company. Today we are announcing a new initiative, Microsoft Game Stack, in which we commit to bringing together Microsoft tools and services that will empower game developers like yourself, whether you’re an indie developer just starting out or a AAA studio, to achieve more.
This is the start of a new journey, and today we are only taking the first steps. We believe Microsoft is uniquely suited to deliver on that commitment. Our company has a long legacy in games – and in building developer-focused platforms.
There are 2 billion gamers in the world today, playing a broad range of games, on a broad range of devices. There is as much focus on video streaming, watching, and sharing within a community as there is on playing or competing. As game creators, you strive every day to continuously engage your players, to spark their imaginations, and inspire them, regardless of where they are, or what device they’re using. Today, we’re introducing Microsoft Game Stack,
I’m excited to announce the release of our first Azure Blueprint built specifically for a compliance standard: the ISO 27001 Shared Services blueprint sample, which maps a set of foundational Azure infrastructure, such as virtual networks and policies, to specific ISO controls.
Microsoft Azure leads the industry with over 90 compliance offerings. Azure meets a broad set of international and industry-specific compliance standards, such as General Data Protection Regulation (GDPR), ISO 27001, HIPAA, PCI, SOC 1 and SOC 2, as well as country-specific standards, including FedRAMP and other NIST 800-53 derived standards, Australia IRAP, UK G-Cloud, and Singapore MTCS. Many of our customers have expressed their interest in being able to leverage and build upon our internal compliance practices for their environments with a service that maps compliance settings automatically.
To help our customers simplify the creation of their environments in Azure while successfully interpreting US and international governance requirements, we are announcing a series of built-in Blueprints Architectures that can be leveraged during your cloud-adoption journey. Azure Blueprints is a free service that helps customers deploy and update cloud environments in a repeatable manner using composable artifacts such as policies, deployment templates, and role-based access controls. This service is
Azure Databricks provides a fast, easy, and collaborative Apache® Spark™-based analytics platform to accelerate and simplify the process of building big data and AI solutions that drive the business forward, all backed by industry-leading SLAs.
With Azure Databricks, you can set up your Spark environment in minutes and autoscale quickly and easily. You can also apply your existing skills and collaborate on shared projects in an interactive workspace with support for Python, Scala, R, and SQL, as well as data science frameworks and libraries like TensorFlow and PyTorch.
We’re continuously listening to customers and answering questions as we evolve this service. This blog outlines important service announcements that we are proud to deliver for our customers.
Azure Databricks Delta available in Standard and Premium SKUs
Azure Databricks Delta brings new levels of reliability and performance for production workloads based on a number of improvements including transaction support, schema validation, indexing, and data versioning.
Since the preview of Delta was announced, we have received overwhelmingly positive feedback on how it has helped customers build complex pipelines for both batch and streaming data, and simplified ETL pipelines. We are excited to announce that Delta is now available in our Standard SKU offering
APIs have become mundane; they are now the de facto standard for connecting apps, data, and services. In the larger picture, APIs are driving digital transformation in organizations.
With the strategic value of APIs, a continuous integration (CI) and continuous deployment (CD) pipeline has become an important aspect of API development. It allows organizations to automate deployment of API changes without error-prone manual steps, detect issues earlier, and ultimately deliver value to end users faster.
This blog walks you through a conceptual framework for implementing a CI/CD pipeline for deploying changes to APIs published with Azure API Management.
Organizations today normally have multiple deployment environments (e.g., Development, Testing, Production) and use separate API Management instances for each environment. Some of these instances are shared by multiple development teams, who are responsible for different APIs with different release cadences.
As a result, customers often come to us with the following challenges:
How do you automate deployment of APIs into API Management? How do you migrate configurations from one environment to another? And how do you avoid interference between development teams who share the same API Management instance?
We believe the approach described below will address all these challenges.
CI/CD with API Management
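One common pattern, shown here as an illustration rather than a prescribed toolchain, is to keep API configurations in source control as Resource Manager templates and have the pipeline deploy them into the API Management instance for each environment. The resource group, template path, and parameter values below are hypothetical.

```shell
# Hypothetical CI/CD step: deploy an API Management configuration
# (stored as an ARM template in the repository) to the dev environment.
az group deployment create \
  --resource-group apim-dev-rg \
  --template-file templates/service.template.json \
  --parameters apimServiceName=apim-dev
```

Because each environment only differs in its parameter values, promoting a change from Development to Testing to Production becomes a matter of re-running the same deployment against a different resource group, which keeps environments consistent and avoids error-prone manual steps.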
Azure Databricks provides a fast, easy, and collaborative Apache Spark™-based analytics platform to accelerate and simplify the process of building big data and AI solutions backed by industry-leading SLAs.
With Azure Databricks, customers can set up an optimized Apache Spark environment in minutes. Data scientists and data engineers can collaborate using an interactive workspace with languages and tools of their choice. Native integration with Azure Active Directory (Azure AD) and other Azure services enables customers to build end-to-end modern data warehouse, machine learning and real-time analytics solutions.
We have seen tremendous adoption of Azure Databricks and today we are excited to announce new capabilities that we are bringing to market.
General availability of Data Engineering Light
Customers can now get started with Azure Databricks with a new low-priced workload called Data Engineering Light that enables customers to run batch applications on managed Apache Spark. It is meant for simple, non-critical workloads that don’t need the performance, autoscaling, and other benefits provided by Data Engineering and Data Analytics workloads. Get started with this new workload.
Additionally, we have reduced the price of the Data Engineering workload across both the Standard and Premium SKUs. Both SKUs are now available at