We are happy to announce that Accelerated Networking (AN) is generally available (GA) for Windows and the latest distributions of Linux, providing up to 30 Gbps of networking throughput, free of charge!
AN provides consistent, ultra-low network latency via Azure’s in-house programmable hardware and technologies such as single root I/O virtualization (SR-IOV). By moving much of Azure’s software-defined networking stack off the CPUs and into FPGA-based SmartNICs, compute cycles are reclaimed for end-user applications, reducing load on the VM and decreasing jitter and inconsistency in latency.
With the GA of AN, region limitations have been removed, making the feature widely available around the world. Supported VM series include D/DSv2, D/DSv3, E/ESv3, F/FS, FSv2, and Ms/Mms.
The deployment experience for AN has also been improved since public preview. Many of the latest Linux images available in the Azure Marketplace, including Ubuntu 16.04, Red Hat Enterprise Linux 7.4, CentOS 7.4 (distributed by Rogue Wave Software), and SUSE Linux Enterprise Server 12 SP3, work out of the box with no further setup steps needed. Windows Server 2016 and Windows Server 2012 R2 also work out of the box.
All the information needed to deploy a VM with AN can be found here: Windows AN VM.
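As a sketch of what enabling AN looks like at deployment time, the key setting is the `enableAcceleratedNetworking` flag on the VM's network interface. The helper below builds a minimal ARM-template NIC resource fragment with that flag set; the resource names, subnet ID, and API version here are illustrative placeholders, not taken from the official quickstart.

```python
import json

def nic_resource(nic_name: str, subnet_id: str, enable_an: bool = True) -> dict:
    """Build an ARM-template NIC resource with Accelerated Networking enabled."""
    return {
        "type": "Microsoft.Network/networkInterfaces",
        "apiVersion": "2017-10-01",  # illustrative; use a current API version
        "name": nic_name,
        "location": "[resourceGroup().location]",
        "properties": {
            # This flag turns on SR-IOV / Accelerated Networking for the NIC.
            "enableAcceleratedNetworking": enable_an,
            "ipConfigurations": [{
                "name": "ipconfig1",
                "properties": {
                    "privateIPAllocationMethod": "Dynamic",
                    "subnet": {"id": subnet_id},
                },
            }],
        },
    }

# Emit the fragment for inclusion in a template's "resources" array.
print(json.dumps(nic_resource("my-an-nic", "<subnet-resource-id>"), indent=2))
```

The VM resource then references this NIC as usual; AN itself needs no agent or in-guest configuration on the supported images listed above.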
We have had tons of interest in our VMware virtualization on Azure offering. This includes questions about what we are offering and how we will provide an enterprise grade solution. Here are some of the details on the preview.
To enable this solution, we are working with multiple VMware Cloud Provider Program partners and running on existing VMware-certified hardware. For example, our preview hardware will use a FlexPod bare-metal configuration with NetApp storage. This hosted solution is similar to Azure’s bare-metal SAP HANA Large Instances solution that we launched last year. With this approach, we will enable you to use the same industry-leading VMware software and services that you currently use in your on-premises datacenters, but running on Azure infrastructure, allowing L3 network connectivity for existing applications to Azure-native services like Azure Active Directory, Azure Cosmos DB, and Azure Functions.
We are facilitating discussions with VMware and the VCPP partners to ensure you have a great solution and a great support experience when we make this offering generally available next year. More details from VMware on this can be found here. We will share more information on GA plans and partners in the coming months.
Azure HDInsight is a fully-managed cloud service that makes it easy, fast, and cost-effective to process massive amounts of data. Use the most popular open-source frameworks such as Hadoop, Spark, Hive, LLAP, Kafka, Storm, R & more. Azure HDInsight enables a broad range of scenarios such as ETL, Data Warehousing, Machine Learning, and IoT.
By default, when you provision an HDInsight cluster, you are required to create a local admin user and a local SSH user, both of which have full access to the cluster. The local admin user can access all the files, folders, tables, and columns. With a single local user, there is no role-based access control. However, as enterprise customers move to the cloud, they must meet strict security requirements around authentication, authorization, auditing, and governance. This is especially important when larger or multiple teams share the same cluster, and admins don’t want to create individual clusters for individual users. When we talked to customers, we heard three main requests for enabling cluster access for multiple users:
As a data scientist, I want to use my Active Directory domain credentials to run queries on the cluster. As a cluster admin, I want to configure role-based access to the cluster’s data and resources.
Today, we are really happy to announce that we are reducing prices for the Azure HDInsight service and making several awesome capabilities generally available.
Launched in 2013, Azure HDInsight is a fully-managed, full-spectrum, open-source analytics cloud service by Microsoft that makes it easy, fast, and cost-effective to process massive amounts of data. You can use the most popular open-source engines such as Hadoop, Spark, Hive, LLAP, Kafka, Storm, HBase, and R, and install more open-source frameworks from the OSS ecosystem.
Amazing value for our customers
Customers ranging from startups to enterprises are using Azure HDInsight for their mission-critical applications. The service enables a broad range of scenarios in manufacturing, retail, education, nonprofit, government, healthcare, media, banking, telecommunications, insurance, and many more industries, with use cases ranging from ETL to data warehousing and from machine learning to IoT. Many Fortune 500 customers are running their big data pipelines on Azure HDInsight:
AccuWeather is using this technology to gain real-time intelligence into weather and business patterns. Handling 17 billion requests for data each day, AccuWeather is helping 1.5 billion people safeguard and improve their lives and businesses.
Cornell Lab of Ornithology improved Machine Learning Workflow with Azure
Apache Kafka on Azure HDInsight was added last year as a preview service to help enterprises create real-time big data pipelines. Since then, large companies such as Toyota, Adobe, Bing Ads, and GE have been using this service in production to process over a million events per second, powering scenarios for connected cars, fraud detection, clickstream analysis, and log analytics. HDInsight has worked very closely with these customers to understand the challenges of running a robust, real-time production pipeline at enterprise scale. Using our learnings, we have implemented key features in the managed Kafka service on HDInsight, which is now generally available.
A fully managed Kafka service for the enterprise use case
Running big data streaming pipelines is hard. Doing so with open-source technologies for the enterprise is even harder. Apache Kafka, a key open-source technology, has emerged as the de facto technology for ingesting large volumes of streaming events in a scalable, low-latency, and low-cost fashion. Enterprises want to leverage this technology; however, there are many challenges in installing, managing, and maintaining a streaming pipeline. Open-source bits lack support, and in-house talent needs to be well versed in these technologies to ensure the highest levels of reliability.
I am excited to announce the general availability of HDInsight Integration with Azure Log Analytics.
Azure HDInsight is a fully managed cloud service for customers to do analytics at scale using the most popular open-source engines such as Hadoop, Hive/LLAP, Presto, Spark, Kafka, Storm, and HBase.
Thousands of our customers run their big data analytical applications on HDInsight at global scale. The ability to monitor this infrastructure, detect failures quickly, and take quick remedial action is key to ensuring a better customer experience.
Log Analytics is part of Microsoft Azure’s overall monitoring solution. Log Analytics helps you monitor cloud and on-premises environments to maintain availability and performance.
Our integration with Log Analytics will make it easier for our customers to operate their big data production workloads in a more effective and simple manner.
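Beyond the built-in cluster telemetry, Log Analytics also accepts custom records through its HTTP Data Collector API, where each request is signed with the workspace's shared key. The stdlib-only sketch below builds that `SharedKey` authorization header as I understand the documented scheme; the workspace ID and key are placeholders, and you should verify the string-to-sign format against the official API reference before relying on it.

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key_b64: str,
                    date_rfc1123: str, content_length: int) -> str:
    """Build the SharedKey authorization header for the HTTP Data Collector API."""
    # The string-to-sign covers method, body length, content type, date, and resource.
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    key = base64.b64decode(shared_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    return f"SharedKey {workspace_id}:{signature}"
```

The signed JSON body is then POSTed to `https://<workspace-id>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01` with `Authorization`, `x-ms-date`, and a `Log-Type` header naming the custom log table.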
Monitor & debug the full spectrum of big data open-source engines at global scale
Typical big data pipelines utilize multiple open-source engines: Kafka for ingestion, Spark Streaming or Storm for stream processing, Hive and Spark for ETL, and Interactive Query (LLAP) for blazing-fast querying of big data.
Additionally, these pipelines may be running in different datacenters across the globe.
With the new HDInsight monitoring integration, customers can monitor all of their clusters from a single Log Analytics workspace.
This post was authored by the Azure Bot Service and Language Understanding Team.
Microsoft brings the latest advanced chatbot capabilities to developers’ fingertips, allowing them to create apps that see, hear, speak, understand, and interpret users’ needs — using natural communication styles and methods.
Today, we’re excited to announce the general availability of the Microsoft Cognitive Services Language Understanding service (LUIS) and the Azure Bot Service, two top-notch AI services for creating digital agents that interact in natural ways and make sense of the surrounding environment.
Think about the possibilities: all developers, regardless of expertise in data science, able to build conversational AI that can enrich and expand the reach of applications to audiences across a myriad of conversational channels. The app will be able to understand natural language, reason about content, and take intelligent actions. Bringing intelligent agents to developers and organizations that do not have expertise in data science is disruptive to the way humans interact with computers in their daily lives and the way enterprises run their businesses with their customers and employees.
Through our preview journey over the past two years, we have learned a lot from interacting with thousands of customers undergoing digital transformation.
Conversational AI, or making human and computer interactions more natural, has been a goal since technology became ubiquitous in our society. Our mission is to bring conversational AI tools and capabilities to every developer and every organization on the planet, and help businesses augment human ingenuity in unique and differentiated ways.
Today, I’m excited to announce Microsoft Azure Bot Service and Microsoft Cognitive Services Language Understanding (LUIS) are both generally available.
Azure Bot Service enables developers to create conversational interfaces on multiple channels while Language Understanding (LUIS) helps developers create customized natural interactions on any platform for any type of application, including bots. Making these two services generally available on Azure simultaneously extends the capabilities of developers to build custom models that can naturally interpret the intentions of people conversing with bots.
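Once a LUIS app is published, a bot queries it over plain HTTPS and gets back the top-scoring intent and entities as JSON. The sketch below composes the v2.0-era prediction endpoint URL; the region, app ID, and key are placeholders, and the exact endpoint shape should be checked against the current LUIS documentation.

```python
from urllib.parse import urlencode

def luis_query_url(region: str, app_id: str, subscription_key: str, utterance: str) -> str:
    """Compose a GET URL for the LUIS v2.0 prediction endpoint (illustrative)."""
    base = f"https://{region}.api.cognitive.microsoft.com/luis/v2.0/apps/{app_id}"
    # urlencode handles percent-escaping of the user's utterance in `q`.
    params = urlencode({
        "subscription-key": subscription_key,
        "verbose": "true",
        "q": utterance,
    })
    return f"{base}?{params}"

url = luis_query_url("westus", "<app-id>", "<key>", "book a flight to Cairo")
```

A bot built on the Azure Bot Service would typically not hand-roll this URL; the Bot Builder SDK's LUIS recognizer wraps the same call.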
This announcement delivers on our AI Platform approach, providing developers and data scientists with all the tools they need to create AI applications in the cloud and on mobile devices. In November, at Connect(); 2017, we released tools to infuse AI into new and existing applications quickly and easily with updates to Azure Machine Learning (AML), including Azure IoT Edge integration, as well as new Visual Studio Tools for AI.
Enterprise customers choose Azure because of the unique value it provides as a productive, hybrid, intelligent, and trusted cloud. Today I’m excited to announce four new management and cost-savings capabilities. Azure Policy, now in public preview, provides control and governance at scale for your Azure resources. Azure Cost Management is rolling out support for Azure Virtual Machine Reserved Instances management later this week to help you maximize savings over time. To continue our commitment to making Azure cost-effective, we are reducing prices by up to 4% on our Dv3 Series in several regions in the coming days, and making our lowest-priced storage tier, Azure Archive Storage, generally available today.
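To make the Azure Policy announcement concrete, a policy definition is a small JSON rule in an if/then shape evaluated against every resource request. The sketch below builds a minimal "allowed locations" rule that denies resources created outside an approved region list; the region names are illustrative, and the rule follows the standard policy schema as I understand it.

```python
import json

def allowed_locations_policy(locations: list) -> dict:
    """Build a policy rule denying any resource whose location is not allowed."""
    return {
        "if": {
            # Match resources whose `location` field is NOT in the allowed list.
            "not": {
                "field": "location",
                "in": locations,
            }
        },
        # Deny the request at deployment time.
        "then": {"effect": "deny"},
    }

policy = allowed_locations_policy(["westus2", "westeurope"])
print(json.dumps(policy, indent=2))
```

Assigned at a subscription or resource-group scope, a rule like this gives the at-scale governance described above without per-resource reviews.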
Simple ways to ensure a secure and well-managed cloud infrastructure
Azure is committed to providing a secure cloud foundation, while making available a comprehensive set of services to ensure that your cloud resources are secure and well-managed. Cloud security and management is a joint responsibility between Microsoft and the customer. We recommend that customers follow secure and well-managed cloud best practices for every production virtual machine. To help you achieve this goal, Azure has built-in services that can be configured quickly and are always kept up to date.
Today we’re excited to announce the general availability of Archive Blob Storage, starting at an industry-leading price of $0.002 per gigabyte per month! Last year, we launched Cool Blob Storage to help customers reduce storage costs by tiering their infrequently accessed data to the Cool tier. Organizations can now reduce their storage costs even further by storing their rarely accessed data in the Archive tier. We’re also excited to announce the general availability of Blob-Level Tiering, which enables customers to optimize storage costs by easily managing the lifecycle of their data across these tiers at the object level.
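Under the hood, Blob-Level Tiering is driven by the Set Blob Tier REST operation on an individual blob. The stdlib-only sketch below composes that request; the account, container, and blob names are placeholders, authentication is omitted, and the `x-ms-version` string is my assumption about the first service version supporting the operation, so check it against the REST API reference.

```python
def set_blob_tier_request(account: str, container: str, blob: str, tier: str = "Archive"):
    """Compose the method, URL, and headers for the Set Blob Tier REST operation."""
    assert tier in {"Hot", "Cool", "Archive"}
    # `comp=tier` selects the Set Blob Tier operation on this blob.
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}?comp=tier"
    headers = {
        "x-ms-version": "2017-04-17",  # assumed service version; verify in the docs
        "x-ms-access-tier": tier,
    }
    return "PUT", url, headers

method, url, headers = set_blob_tier_request("myaccount", "backups", "2017-12.bak")
```

In practice you would issue this through the Azure Storage SDK or portal rather than raw HTTP, but the object-level granularity is the same: each blob moves between Hot, Cool, and Archive independently.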
From startups to large organizations, our customers in every industry have experienced exponential growth of their data. A significant amount of this data is rarely accessed but must be stored for a long period of time to meet business continuity or compliance requirements; think employee data, medical records, customer information, financial records, backups, and so on. Additionally, recent and upcoming advances in artificial intelligence and data analytics are unlocking value from data that might previously have been discarded. Customers in many industries want to keep more of these data sets for longer, but need a scalable and cost-effective solution.