We continue to expand the Azure Marketplace ecosystem. From April 1st to 15th, 20 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
(Basic) Apache NiFi 1.4 on Centos 7.4: A CentOS 7.4 VM running a basic install of Apache NiFi 1.4 using default configurations. Once the virtual machine is deployed and running, Apache NiFi can be accessed via web browser.
Ethereum developer kit (techlatest.net): If you are looking to get started with Ethereum development and want an out-of-the-box environment to get up and running in minutes, this VM is for you. It includes the Truffle Ethereum framework, a world-class development environment.
xID: eXtensible IDentity (xID) is an open (standards-based), modular (componentized architecture), secure (security built in), and pluggable (adapter-based integration approach) product built specifically to deliver your organization’s identity management needs.
Qualys Virtual Scanner Appliance: Qualys Virtual Scanner Appliance helps you get a continuous view of security and compliance, putting a spotlight on your Microsoft Azure cloud infrastructure. It’s a stateless resource that acts as an extension to the Qualys Cloud Platform.
FileCloud on Ubuntu Linux: FileCloud
Looking forward to Microsoft Build? Now you’ve got one more reason. After three days of can’t-miss tech sessions and skill-sharpening workshops, we’re throwing an awesome party for attendees at Seattle Center.
We’ll celebrate with an evening of music, games, exhibits, and more at these world-famous Seattle sites, open exclusively to Microsoft Build attendees on the evening of May 9, 2018, starting at 7:30 PM:
Drop by MoPOP, designed by internationally acclaimed architect Frank O. Gehry, and lose yourself in the latest exhibition dedicated to your favorite comic books, Marvel: Universe of Super Heroes. Discover everything you ever wanted to know about the greatest guitarists of all time, grunge rock legend Kurt Cobain and Nirvana, the iconic Captain Kirk and the rest of the Star Trek crew, classic horror movies, sci-fi masterpieces, and more. The museum has tons of unique exhibits and hands-on experiences to check out. And you won’t want to miss the famous Sky Church with state-of-the-art acoustics and a soaring 65-foot ceiling – the perfect setting for the evening’s live music entertainment.

Get your silent groove on at Next 50 Plaza, just outside MoPOP. Grab a headset, pick your DJ, and dance to your favorite beat in
We are excited to announce the public preview of Network Performance Monitor’s (NPM) new capability to monitor Microsoft peering in ExpressRoute. This complements NPM’s generally available capability to monitor ExpressRoute’s Azure Private peering. Throughout this post, I will walk you through some of the important use cases of the capability.
Get complete visibility into the ExpressRoute connection to the Microsoft services
It can be extremely difficult to identify the bottleneck when you experience performance degradation while accessing a Microsoft service over ExpressRoute, because an ExpressRoute connection comprises various components. With this capability, you can now get the required end-to-end visibility through NPM’s interactive topology view. You can not only view all the constituent components – the on-premises network, the circuit provider edge, the ExpressRoute circuit, and the Microsoft edge – but also see the latency contributed by each hop, to help you identify the troublesome segment.
The following snippet illustrates a topology view where the on-premises computer on the left is connected to the Microsoft service (outlook.office365.com) on the right, over primary and secondary ExpressRoute Microsoft peering connections. The service provider router at the customer edge and the Microsoft router at the Azure edge are also depicted. The on-premises hops (depicted by dashed
Today, I am thrilled to announce the general availability of Global VNet Peering in all Azure public regions, empowering you to take the ease, simplicity, and isolation of VNet peering to the next level.
Azure Virtual Network (VNet) is a logically isolated section of the Azure cloud that enables you to securely connect Azure resources to each other. A VNet lets you create your own private space in Azure – your own network bubble, as I like to call it.
With Global VNet Peering now available, you can enable connectivity across all Azure public regions without additional bandwidth restrictions, while, as always, keeping all your traffic on the Microsoft backbone. Global VNet Peering provides you with the flexibility to scale and control how workloads connect across geographical boundaries, unlocking global scale for a plethora of scenarios such as data replication, database failover, and disaster recovery through private IP addresses. You can also share resources across different business unit VNets – the hub-and-spoke model, as we refer to it – through a global peering connection. As your organization grows across geographic boundaries, you can continue to share resources like firewalls or other virtual appliances via peering.
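As a sketch of how such a peering might be configured, the Azure CLI commands below create a peering in both directions between two VNets in different regions. All resource and VNet names here are placeholders, and flag names may vary by CLI version; consult the Azure CLI documentation for your installed release.

```shell
# Placeholder names throughout; peering must be created in BOTH
# directions before the connection becomes fully established.

# Peer WestVNet (West US) to EastVNet (East US)
az network vnet peering create \
  --name WestToEast \
  --resource-group WestRG \
  --vnet-name WestVNet \
  --remote-vnet-id "$(az network vnet show -g EastRG -n EastVNet --query id -o tsv)" \
  --allow-vnet-access

# Reverse direction: EastVNet back to WestVNet
az network vnet peering create \
  --name EastToWest \
  --resource-group EastRG \
  --vnet-name EastVNet \
  --remote-vnet-id "$(az network vnet show -g WestRG -n WestVNet --query id -o tsv)" \
  --allow-vnet-access
```

Once both sides report a peering state of Connected, resources in the two VNets can reach each other over private IP addresses.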
Continuing our series of tutorials on SaaS application patterns with SQL Database, we are delighted to announce an additional cross tenant analytics tutorial. This new tutorial shows how to extract and load tenant data into Azure SQL Data Warehouse (SQL DW) using Azure Data Factory (ADF) and then analyze it in Power BI.
In this tutorial, ADF is used to orchestrate data movement from tenant databases into a SQL Data Warehouse. Parameterized ADF V2 (preview) pipelines are defined to iterate across tenant databases, loading data from multiple databases in parallel. To accelerate loading, ADF stages extracted data in blob files and then uses PolyBase to load into SQL DW. Staging the data and enabling PolyBase are simple check-box operations in ADF.
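To illustrate what a staged, PolyBase-enabled copy can look like in an ADF V2 pipeline definition, here is a hypothetical copy activity fragment. The linked service and container names are invented for the example and are not taken from the tutorial:

```json
{
  "name": "CopyTenantDataToSqlDw",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "SqlSource" },
    "sink": {
      "type": "SqlDWSink",
      "allowPolyBase": true
    },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "StagingBlobStorage",
        "type": "LinkedServiceReference"
      },
      "path": "stagingcontainer"
    }
  }
}
```

Setting `enableStaging` tells ADF to land extracted data in blob files first, and `allowPolyBase` on the SQL DW sink loads those files in bulk rather than row by row – the two check-box operations the text refers to.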
The tutorial uses an Extract, Load and Transform (ELT) pattern – once data is loaded into staging tables in SQL DW, ADF invokes a stored procedure to upsert the data into star-schema tables, ready for query. Power BI is then used to visualize the data and extract insights that in the tutorial scenario can help the ISV improve their ticket selling application and business.
To get started, check out the analytics tutorial, which provides step-by-step instructions.
Azure Event Hubs is expanding its ecosystem to support more languages. Azure Event Hubs is a highly scalable data-streaming platform processing millions of events per second. Event Hubs uses Advanced Message Queuing Protocol (AMQP 1.0) to enable interoperability and compatibility across platforms. Now, with the addition of new clients, you can easily get started with Event Hubs.
We are happy to have the new client libraries for Go, Python, and Node.js in public preview. Build your application logging or clickstream analytics pipelines, live dashboarding, or any telemetry processing with our rich ecosystem, in the language of your choice.
Ingest and consume events and logs from your Python applications, stream from your Node.js applications, or simply integrate with your Go applications. You now have a wide palette to choose from based on your needs.
The following updates will provide more insights into the public preview of the new client libraries.
Event Hubs for Go: this new package offers easy-to-use Send and Receive functions that communicate with the Event Hubs service using the AMQP 1.0 protocol, as implemented by github.com/vcabbage/amqp. What’s more, it also offers an Event Processor Host to manage load balancing and lease management for consumers. The readme helps
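A minimal send sketch with the Go package might look like the following. The connection string and event body are placeholders, and since the library is in public preview, the exact API surface may differ between versions; treat this as an assumption-laden illustration rather than a definitive sample.

```go
package main

import (
	"context"
	"fmt"
	"os"
	"time"

	eventhub "github.com/Azure/azure-event-hubs-go"
)

func main() {
	// Assumed: an Event Hubs connection string (including EntityPath)
	// is provided via the environment.
	connStr := os.Getenv("EVENTHUB_CONNECTION_STRING")

	hub, err := eventhub.NewHubFromConnectionString(connStr)
	if err != nil {
		fmt.Println("failed to create hub client:", err)
		return
	}

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// Send a single event to the hub.
	if err := hub.Send(ctx, eventhub.NewEventFromString("hello, Event Hubs")); err != nil {
		fmt.Println("send failed:", err)
	}

	hub.Close(ctx)
}
```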
Source: https://blogs.msdn.microsoft.com/sql_server_team/replication-enhancement-improved-distribution-database-cleanup/ The replication distribution database is perhaps the most important entity in the entire replication topology. Apart from storing the replication metadata and history for replication agents, in transactional replication the distribution database also provides intermediate
Secure credential management is essential to protect data in the cloud. With Azure Key Vault, you can encrypt keys and small secrets like passwords. Azure Data Factory is now integrated with Azure Key Vault: you can store credentials for the data stores and computes referenced in your Azure Data Factory ETL (Extract, Transform, Load) workloads in an Azure Key Vault. Simply create an Azure Key Vault linked service and reference the secret stored in the Key Vault in your data factory pipelines.
Azure Data Factory will now automatically pull the credentials for your data stores and computes from Azure Key Vault during pipeline execution. With Key Vault, you don’t need to provision, configure, patch, or maintain key management software. Just provision new vaults and keys in minutes, centrally manage keys, secrets, and policies, and refer to the keys in your data factory pipelines. You keep control over your keys by simply granting permission for your own applications and the Data Factory service to use them as needed; Data Factory never has direct access to keys. Developers manage keys used for dev/test and seamlessly migrate to production keys that are managed by security operations.
With Azure Key
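To illustrate the pattern, a hypothetical Key Vault linked service definition might look like this (the vault name is a placeholder):

```json
{
  "name": "MyKeyVault",
  "properties": {
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://<your-vault-name>.vault.azure.net"
    }
  }
}
```

A data store linked service can then reference a secret held in that vault instead of embedding the credential inline; the secret name and linked service names below are likewise invented for the example:

```json
{
  "name": "MySqlDatabase",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "MyKeyVault",
          "type": "LinkedServiceReference"
        },
        "secretName": "SqlConnectionString"
      }
    }
  }
}
```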
Today we are announcing the general availability release of AzCopy on Linux. AzCopy is a command line data transfer utility designed to move large amounts of data to and from Azure Storage with optimal performance. It is designed to handle transient failures with automatic retries, as well as to provide a resume option for failed transfers. This general availability release includes new and enhanced features, as well as performance improvements thanks to the feedback we received during the Preview.
You can get started with the latest AzCopy release following the documentation.
What’s new? Throughput improvements up to 3X
Investments in performance improvements and the move to .NET Core 2.1 have boosted AzCopy throughput significantly. In our tests, we have seen up to a 3x improvement in throughput when transferring multiple large files, and up to a 2x improvement in scenarios where millions of small files are transferred.
AzCopy now packages .NET Core 2.1, eliminating the need to manually install .NET Core as a prerequisite. You can now extract the AzCopy package and start using it. You might, however, need to install the .NET Core dependencies on some Linux distributions. Please consult the documentation for the
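A typical invocation on Linux might look like the sketch below, which recursively uploads a local directory to a blob container. The account, container, path, and key are all placeholders, and flag names may differ between AzCopy releases, so check the documentation for your installed version.

```shell
# Placeholder values throughout; substitute your own storage account,
# container, local path, and account key.
azcopy \
  --source /mnt/myfiles \
  --destination https://myaccount.blob.core.windows.net/mycontainer \
  --dest-key <storage-account-key> \
  --recursive
```

If a transfer is interrupted, rerunning the same command lets AzCopy pick up from its journal and resume the failed transfer rather than starting over.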
Microsoft Build 2018 is only a short week away on May 7-9. Whether you can’t make it to Seattle or you just want to enhance your on-the-ground experience at the event, Microsoft Build Live brings you live as well as on-demand access to three days of inspiring speakers, spirited discussions, and virtual networking. The livestream gives you another way to connect, spark ideas, and deepen your engagement with the latest ideas in the cloud, AI, mixed reality, and more.
Now in preview
How Azure Security Center helps detect attacks against your Linux machines – Azure Security Center (ASC) is now extending its Linux threat detection preview program, both in the cloud and on-premises. New capabilities include detection of suspicious processes, suspect login attempts, and anomalous kernel module loads. Security Center collects machine events using auditd, one of the most common auditing frameworks on Linux. Any Linux machine that runs auditd by default and is covered by Security Center will benefit from this public preview. This post also walks you through an example of detection analytics in action, in the form of malicious crypto-coin mining. It also includes some tips on using Azure Log Analytics