Ansible 2.7 will be released on 4 October 2018. Here I'd like to share what's coming for Azure in Ansible 2.7. In total, 21 new Azure modules were added. With 2.7 you now have the ability to natively automate the deployment and configuration of the Azure resources below.
Azure Web Apps: Create and configure Azure Web Apps to host web applications, REST APIs, and mobile backends using Ansible.
Azure Traffic Manager: Create and configure Azure Traffic Manager to distribute traffic optimally to services across global Azure regions using Ansible.
Azure Database: Create and configure an Azure Database for SQL/MySQL/PostgreSQL server using Ansible.
Azure Route: Create and configure your own routes to override Azure’s default routing using Ansible.
Azure Application Gateway: Create and configure an Azure Application Gateway to manage web traffic using Ansible.
Azure Autoscale: Create and configure Azure autoscale to help applications perform their best when demand changes using Ansible.
Additional facts modules for VM and ACR: Get information about virtual machines or Azure Container Registry for further configuration using Ansible.
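As a quick illustration, here is a minimal sketch of a playbook using the new azure_rm_webapp module to create a Web App on a Linux App Service plan. The resource group, plan, and app names are placeholders, and the exact parameters should be checked against the module documentation for your Ansible version:

```yaml
# Sketch: create an Azure Web App running Node.js with the new
# azure_rm_webapp module (Ansible 2.7). Names below are examples only.
- hosts: localhost
  connection: local
  tasks:
    - name: Create a Linux web app with a Node.js runtime
      azure_rm_webapp:
        resource_group: myResourceGroup      # placeholder
        name: myUniqueWebApp                 # must be globally unique
        plan:
          resource_group: myResourceGroup
          name: myAppServicePlan             # placeholder
          is_linux: true
          sku: S1
          number_of_workers: 1
        frameworks:
          - name: node
            version: "6.6"
```

Run it with `ansible-playbook webapp.yml` after authenticating to Azure (for example via `az login` or service-principal environment variables).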
The Spark + AI Summit Europe kicks off in just a few days in London. Microsoft and many of its customers using Azure Databricks will be present during the summit. Azure Databricks is a first-party service on Azure that allows customers to accelerate big data analytics and artificial intelligence (AI) solutions with a fast, easy, and collaborative Apache Spark™–based analytics service. The platform improves developer productivity with a single, consistent set of APIs, and developers can mix and match different kinds of processing within the same environment. Azure Databricks also improves performance by eliminating unnecessary movement of data across environments.
Here are a few recommended sessions you might find interesting, where customers and partners share success stories leveraging Azure Databricks:
For Oil & Gas, Moving Towards AI: Learn from an actual customer how they are leveraging deep learning with Azure Databricks to detect safety incidents at their gas stations, and how they were able to build an Advanced Analytics COE to lead AI projects across the organization.
For Retail, Co-op's Transformation from Brick and Mortar to AI with Databricks: In this session, learn from the head of data within a consumer co-operative
Azure Stack features a growing independent software vendor (ISV) community that operates across a broad-spectrum of environments, empowering you to create compelling and powerful solutions.
Today, I’d like to highlight some of our ISV partners that address common customer requirements for Azure Stack.
Data protection and disaster recovery
Operators and users of Azure Stack deploying applications and datasets need the ability to quickly recover from data loss and catastrophic failures. With offerings from multiple partners, you can enable data protection and disaster recovery for your applications and data. Supported partners include: Acronis, Actifio, Carbonite, Commvault, Dell EMC, Micro Focus, Quest, Rubrik, Veritas, and ZeroDown. The blog post, Protecting applications and data on Azure Stack, by my colleague Hector Linares, provides an overview of how to protect your applications and data.
Vulnerability and policy compliance scanning for the Azure Stack infrastructure is now a reality thanks to the integration of Azure Stack with Qualys. The Qualys virtual scanner appliance is also coming to the Azure Stack marketplace to enable users to protect their workloads. Backing up your application's secrets on a Hardware Security Module (HSM) will soon be possible thanks to the Azure Stack marketplace solution CipherTrust Cloud Key Manager (CCKM).
When we talk to our customers about their cloud strategy, we continue to hear loud and clear from many of them that a hybrid approach to cloud makes the most sense for their business. Organizations choose a hybrid strategy for many reasons including to take advantage of the cloud while leveraging existing on-premises technology as an asset, to help them achieve greater flexibility, or to meet regulatory requirements.
Customers tell us they look for a cloud platform that enables choices, flexibility, and consistency of experience across their full environment. Microsoft deeply understands this reality. Based on our decades of enterprise experience, we designed Azure to be hybrid from day one, to offer true consistency across cloud and on-premises with the broadest set of hybrid capabilities in applications, data, identity, security, and management.
Customers choose Azure to power their hybrid cloud strategies
More than 95 percent of Fortune 500 companies run their business on Azure today, and many of them take advantage of Azure hybrid capabilities to fuel innovation and deliver great business outcomes.
Running its $15 billion business on Azure using a hybrid cloud model, Smithfield Foods slashed datacenter costs by 60 percent, accelerated application delivery from two months to
Azure Files is the only public cloud file storage that delivers secure, Server Message Block (SMB) based, fully managed cloud file shares that can also be cached on-premises for performance and compatibility. Today, we are excited to announce that Azure Files just got bigger, faster, and better than ever before.
That’s right … starting today we are launching the previews of larger, low-latency file shares with more IOPS, higher throughput, and Azure Active Directory integration. But wait, there’s more! We are also excited to announce the forthcoming new Azure File Sync agent, with significant improvements in sync performance, tiering, and reporting.
Let’s take a sneak peek into the new Azure Files …
Files scale and performance
Premium Files, now in limited public preview, is optimized to deliver consistent performance for IO-intensive enterprise workloads that require high throughput with single-digit-millisecond latency. Premium Files is suitable for a wide variety of workloads such as databases, home directories, analytics, content, and collaboration. Its performance, whether measured by IOPS or throughput, scales with the provisioned capacity to fit your workload’s needs. With premium file shares, IOPS can scale up to 100x and capacity can scale up to
Microsoft has reached a milestone in text-to-speech synthesis with a production system that uses deep neural networks to make the voices of computers nearly indistinguishable from recordings of people. With the human-like natural prosody and clear articulation of words, Neural TTS has significantly reduced listening fatigue when you interact with AI systems.
Our team demonstrated our neural-network powered text-to-speech capability at the Microsoft Ignite conference in Orlando, Florida, this week. The capability is currently available in preview through Azure Cognitive Services Speech Services.
Neural text-to-speech can be used to make interactions with chatbots and virtual assistants more natural and engaging, convert digital texts such as e-books into audiobooks and enhance in-car navigation systems.
The milestone in text-to-speech joins a string of breakthroughs that our group has achieved over the past two years, including human parity in conversational speech recognition and human parity in machine translation.
Our text-to-speech capability uses deep neural networks to overcome the limits of traditional text-to-speech systems in matching the patterns of stress and intonation in spoken language, called prosody, and in synthesizing the units of speech into a computer voice.
Traditional text-to-speech systems break down prosody into separate linguistic analysis and acoustic prediction steps that
We are excited to host you all at this year’s Ignite conference. The Azure Stack team has put together a list of sessions along with a pre-day event to ensure that you will enhance your skills on Microsoft’s hybrid cloud solution and get the most out of this year’s conference.
We have an agenda that is tailored for developers who use Azure Stack to develop innovative hybrid solutions using services on Azure Stack and Azure, as well as operators who are responsible for the operations, security, and resiliency of Azure Stack itself. Whether you’re a developer or an IT operator, there’s something for you.
To fully benefit from our sessions, we recommend you attend our two overview talks, “Intelligent Edge with Azure Stack” and “Azure Stack Overview and Roadmap”. If you’re looking to learn how to operate Azure Stack, we recommend you attend “The Guide to Becoming an Azure Stack Operator” to learn what it takes to get the most out of your investment. If you’re just “Getting started with Microsoft Azure Stack as a developer”, we’ve created a path for you as well. See the learning map below:
Content creators and broadcasters are increasingly embracing the cloud’s global reach, hybrid model, and elastic scale. These attributes, combined with AI’s ability to accelerate insights and time to market across content creation, management, and monetization, are truly transformative.
At the International Broadcasters Conference (IBC) Show 2018, we are focused on bringing Cloud + AI together to help you overcome common media workflow challenges.
Video Indexer, generally available starting today, is a great example of this Cloud + AI focus. It brings together the power of the cloud and Microsoft AI to intelligently analyze your media assets, extract insights, and add metadata. It makes it easier to understand your vast content library, with more than 20 new and improved models, easy-to-use interfaces, a single API, and simplified account management. I have been part of the Video Indexer team since its inception and could not be more excited to see it reach GA. I’m also incredibly proud of the work the team has done to solve real customer problems and make AI tangible in this easy-to-use, elegant solution.
Our partners are already innovating on top of Video Indexer and extending Azure Media Services to advance the state of
Media and entertainment industry conferences are by far some of my favorites. Creativity, disruption, opportunity, and technology – particularly cloud, edge, and AI – are everywhere. It’s been exciting to see those things come together at NAB 2018, SIGGRAPH, and now IBC Show 2018. Together with teams from across Microsoft, I’m looking forward to IBC Show and the chance to learn, collaborate, and advance the state of this dynamic industry.
At this year’s IBC we’re excited to announce the general availability of Video Indexer, our advanced metadata extraction service. First announced in public preview earlier this year, Video Indexer provides a rich set of cross-channel (audio, speech, and visual) learning models. Check out Sudheer’s blog for more information on all the new capabilities, including emotion detection, topic inferencing, and improvements to the ever-popular celebrity recognition model, which recognizes over one million faces.
Video Indexer is just one of the ways Azure is helping customers like Endemol Shine, Multichoice, RTL, and Ericsson with their content needs. At IBC 2018, our teams are excited to share new ways that Azure, together with solutions from our partners, can address common media workflow challenges.
How? Well, read on…
More visual effects and animations mean
This blog post was co-authored by Nile Wilson, Software Engineer Intern, Microsoft.
In an earlier post, we explored how several of the top teams at this year’s Imagine Cup had Artificial Intelligence (AI) at the core of their winning solutions. From helping farmers identify and manage diseased plants to helping the hearing-impaired, this year’s finalists tackled difficult problems that affect people from all walks of life.
In this post, we take a closer look at the champion project of Imagine Cup 2018, smartARM.
Samin Khan and Hamayal Choudhry are the two-member team behind smartARM. The story begins with a by-chance meeting of these middle school classmates. Studying machine learning and computer vision at the University of Toronto, Samin decided to register for the January 2018 UofTHacks hackathon, and coincidentally ran into Hamayal, studying mechatronics engineering at the University of Ontario Institute of Technology. Despite his background, Hamayal was more interested in spectating than in participating. But when catching up, they realized that by combining their skillsets in computer vision, machine learning, and mechatronics, they might just be able to create something special. With this realization, they formed a team at the hackathon and have been working together since.