We are excited to share several updates to the QnA Maker service. To see the previous Conversational AI releases from Microsoft in March, see this post.
New Bot Framework v4 Template for QnA Maker
The QnA Maker service lets you easily create and manage a knowledge base from your data, including FAQ pages, support URLs, PDFs, and doc files. You can test and publish your knowledge base and then connect it to a bot using a Bot Framework sample or template. With this update we have simplified the bot creation process by allowing you to easily create a bot from your knowledge base, without the need for any code or settings changes. Find more details on creating a QnA bot on our tutorials page.
After you publish your knowledge base, you can create a bot from the publish page with the Create Bot button. If you have previously created bots, you can click on “View all” to see all the bots that are linked to your current subscription.
This leads you to a bot creation template in the Azure portal with all of your knowledge base details pre-filled. Your KB ID is connected to the template automatically.
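Once the knowledge base is published, you can also query it directly over REST. The sketch below builds (but does not send) a generateAnswer request against the published endpoint; the host name, KB ID, and endpoint key here are placeholders you would copy from your own knowledge base's publish page.

```python
import json
import urllib.request

def build_generate_answer_request(host, kb_id, endpoint_key, question, top=1):
    """Build (but do not send) a generateAnswer request for a published KB.

    `host` is the runtime endpoint shown on the publish page, e.g.
    "https://my-qnamaker.azurewebsites.net" (hypothetical name).
    """
    url = f"{host}/qnamaker/knowledgebases/{kb_id}/generateAnswer"
    body = json.dumps({"question": question, "top": top}).encode("utf-8")
    headers = {
        # The runtime expects the endpoint key, not the Ocp-Apim subscription key.
        "Authorization": f"EndpointKey {endpoint_key}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")
```

Sending the request with `urllib.request.urlopen` returns a JSON body whose `answers` array holds the top-scoring matches from your knowledge base.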
In a few weeks, over 22,000 people from around the globe will converge on Orlando, Florida, from May 7-9, 2019, for the SAP SAPPHIRE NOW and ASUG Annual Conference. Each year, the event brings together thought leaders across industries to find innovative ways to solve common challenges, unlock new opportunities, and take advantage of emerging technologies that are changing the business landscape as we know it. This year, Microsoft has elevated its presence to the next level with engaging in-booth experiences and informative sessions that will educate, intrigue, and inspire attendees as they take the next step in their digital transformation journey.
Modernize your SAP landscapes
While running SAP on-premises was once business as usual, it is quickly becoming obsolete for businesses looking to compete and win. With the power of the cloud, enterprises have real-time data with intelligent insights from machine learning and artificial intelligence at their fingertips, can spin up a dev-test environment or an application server in minutes instead of hours, and can back up a virtual machine in a few mouse clicks.
At SAP SAPPHIRE NOW, you’ll have the opportunity to get a better understanding of the business value of moving your SAP applications to Azure:
DevOps is the union of people, processes, and products to enable the continuous delivery of value to end users. DevOps for machine learning brings that same lifecycle discipline to machine learning: teams can manage, monitor, and version models while simplifying workflows and the collaboration process.
Effectively managing the machine learning lifecycle is critical to this success, and the first piece of lifecycle management is building your machine learning pipeline(s).
What is a Machine Learning Pipeline?
DevOps for Machine Learning includes data preparation, experimentation, model training, model management, deployment, and monitoring while also enhancing governance, repeatability, and collaboration throughout the model development process. Pipelines allow for the modularization of phases into discrete steps and provide a mechanism for automating, sharing, and reproducing models and ML assets. They create and manage workflows that stitch together machine learning phases. Essentially, pipelines allow you to optimize your workflow with simplicity, speed, portability, and reusability.
There are four steps involved in deploying machine learning that data scientists, engineers, and IT experts collaborate on:

1. Data ingestion and preparation
2. Model training and retraining
3. Model evaluation
4. Deployment

Together, these steps make up the machine learning pipeline.
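The four pipeline steps can be expressed as small, composable functions. The sketch below is a framework-agnostic illustration of the idea, not the Azure Machine Learning SDK; the trivial "mean predictor" model and in-memory registry are stand-ins for real training and deployment.

```python
def ingest_and_prepare(raw):
    # Data ingestion and preparation: drop records with missing values.
    return [r for r in raw if r.get("x") is not None and r.get("y") is not None]

def train(data):
    # Model training: fit a trivial "mean predictor" as a stand-in model.
    mean_y = sum(r["y"] for r in data) / len(data)
    return lambda x: mean_y

def evaluate(model, data):
    # Model evaluation: mean absolute error over the prepared data.
    return sum(abs(model(r["x"]) - r["y"]) for r in data) / len(data)

def deploy(model, registry):
    # Deployment: register the model so downstream consumers can find it.
    registry["latest"] = model
    return registry

def run_pipeline(raw, registry):
    # Stitch the four phases together into one reproducible workflow.
    data = ingest_and_prepare(raw)
    model = train(data)
    score = evaluate(model, data)
    deploy(model, registry)
    return score
```

Because each phase is a discrete step with explicit inputs and outputs, a pipeline framework can cache, parallelize, or rerun individual steps, which is exactly the modularization benefit described above.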
When it comes to executing a machine learning project in an organization, data scientists, project managers, and business leads need to work together to deploy the best models to meet specific business objectives. A central objective of this step is to identify the key business variables that the analysis needs to predict. We refer to these variables as the model targets, and we use the metrics associated with them to determine the success of the project.
In this use case, available to the public on GitHub, we’ll see how a data scientist, project manager, and business lead at a retail grocer can leverage automated machine learning and Azure Machine Learning service to reduce product overstock. Azure Machine Learning service is a cloud service that you use to train, deploy, automate, and manage machine learning models, all at the broad scale that the cloud provides. Automated machine learning within Azure Machine Learning service is the process of taking training data with a defined target feature, and iterating through combinations of algorithms and feature selections to automatically select the best model for your data based on the training scores.
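At its core, automated model selection is a search over (algorithm, feature subset) combinations scored against a metric. The sketch below illustrates that loop with two toy trainers and mean absolute error; it is an illustration of the concept only, not the Azure automated ML implementation, and for brevity it scores on the training data rather than a proper holdout set.

```python
from itertools import combinations

def train_mean(xs, ys):
    # Baseline: always predict the mean of the target.
    m = sum(ys) / len(ys)
    return lambda x: m

def train_linear(xs, ys):
    # One-variable least squares on the first selected feature.
    n = len(xs)
    x1 = [x[0] for x in xs]
    mx, my = sum(x1) / n, sum(ys) / n
    var = sum((v - mx) ** 2 for v in x1) or 1.0  # guard constant features
    slope = sum((v - mx) * (y - my) for v, y in zip(x1, ys)) / var
    return lambda x: my + slope * (x[0] - mx)

def mae(model, xs, ys):
    return sum(abs(model(x) - y) for x, y in zip(xs, ys)) / len(ys)

def auto_select(rows, target, features, trainers):
    """Try every (algorithm, feature subset) pair; return the best by MAE."""
    best = None
    ys = [row[target] for row in rows]
    for k in range(1, len(features) + 1):
        for subset in combinations(features, k):
            xs = [[row[f] for f in subset] for row in rows]
            for name, trainer in trainers.items():
                model = trainer(xs, ys)
                score = mae(model, xs, ys)
                if best is None or score < best[0]:
                    best = (score, name, subset, model)
    return best
```

A real automated ML service adds cross-validation, featurization, and early termination on top of this loop, but the selection principle is the same.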
Excess stock quickly becomes a liquidity problem, as it is not converted into cash.
Five examples of how Azure AI is driving innovation
Whether you’re just starting off in tech; building, managing, or deploying apps; gathering and analyzing data; or solving global issues, anyone can benefit from using cloud technology. Below we’ve gathered five cool examples of innovative artificial intelligence (AI) to showcase how you can be a catalyst for real change.
You know that old box of photos you have sitting in the attic collecting cobwebs; the one with those beautifully embarrassing childhood photos half-covered by a misplaced thumb? How grateful would your family be if you could bring those back to life digitally, at your fingertips? Manually scanning and downloading photos to all your devices would be a huge pain. And if those photos don’t have dates or the names of the people in them written on the back — forget it! But with AI algorithms, cognitive services, and facial recognition processes, organizing these photos by groups is super simple.
By utilizing Azure’s Face API, facial recognition algorithms can quickly and accurately detect, verify, identify, and analyze faces. They can provide facial matching, facial attributes, and characteristic analysis to help organize photos by the people in them.
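Once faces have been identified, organizing the photo collection is a simple grouping step. The sketch below assumes the identification results have already been reduced to a map from photo filename to recognized person names (a simplified, hypothetical stand-in for the output of the Face API's identify operation).

```python
from collections import defaultdict

def group_photos_by_person(identify_results):
    """Group photo files into per-person albums.

    `identify_results` maps a photo filename to the list of person names
    recognized in it. Photos with no recognized faces are collected under
    the "unidentified" key so nothing silently disappears.
    """
    albums = defaultdict(list)
    for photo, people in identify_results.items():
        if not people:
            albums["unidentified"].append(photo)
        for person in people:
            albums[person].append(photo)
    return dict(albums)
```

A photo that contains several recognized people lands in each of their albums, which matches how consumer photo apps typically present face groups.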
In the world of manufacturing, the Industrial Internet of Things (IIoT) has arrived, and that means data. A lot of data. Smart machines, equipped with sensors, add to the large quantity of data already generated by quality systems, MES, ERP, and other production systems. All this data is gathered in different formats and at different cadences, making it nearly impossible to use—or to deliver business insights from. Azure has mastered ingesting and storing manufacturing data with services such as Azure IoT Hub and Azure Data Lake, and now our partner Sight Machine has solved the other huge challenge: data variety. Sight Machine on Azure is a leading AI-enabled analytics platform that enables manufacturers to normalize and contextualize plant floor data in real time. The creation of these digital twins allows them to find new insights, transform operations, and unlock new value.
Data in the pre-digital world
Manufacturers are aware of the untapped potential of production data. Global manufacturers have begun investing in on-premises solutions for capturing and storing factory floor data. But these pre-digital methods have many disadvantages: they result in siloed data, uncontextualized data (raw machine data with no connection to actual production processes), and limited accessibility.
Putting the intelligent cloud to work for content creators, owners and storytellers.
Stories entertain us, make us laugh and cry, and are the lens through which we perceive our world. In that world, increasingly overloaded with information, they catch our attention and, if they catch our hearts, we engage. This makes stories powerful, and it’s why so many large technology companies are investing heavily in content – creating it and selling it.
At Microsoft, we’re not in the business of content creation.
Why? Our mission is to help every person and organization on the planet achieve more. So instead of creating or owning content, we want to provide platforms to help content creators and owners achieve more – from the Intelligent Cloud to the Intelligent Edge, with industry-leading artificial intelligence (AI). We’re excited to see that mission come to life through customers such as Endemol Shine, Multichoice, RTL, Ericsson and partners like Avid, Akamai, Haivision, Pipeline FX and Verizon Digital Media Services. And we are excited to announce new Azure rendering, Azure Media Services, Video Indexer and Azure Networking capabilities to help you achieve more at NAB Show 2019. Cue scene.
Fix it in post: higher resolution, less
Want to train Video Indexer to recognize people relevant specifically to your account? We have great news for you!
Face detection and recognition are two of the most widely used insights that Video Indexer provides. The face recognition feature can recognize around 1M celebrity faces out of the box, and you can train account-level custom Person models to recognize non-celebrity people who are relevant to your specific organization. We have received multiple requests from customers to further enhance custom Person models, and today we are happy to announce a wealth of enhancements that make custom Person model training and management faster and easier.
These enhancements include a centralized custom Person model management page that allows you to create multiple models in your account. Each of these models can hold up to 1M different people. From this page, you can create new models and add new people to existing models; you can also review, rename, and delete your models if needed. On top of that, you can now train your account to identify people from images of their faces even before you upload any video to your account (public preview).
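Custom Person models can also be managed programmatically. The sketch below builds (but does not send) a request to create a new Person model; the path follows the Video Indexer Customization API as we understand it, and the location, account ID, access token, and model name are all placeholder values you would take from your own account.

```python
import urllib.parse
import urllib.request

def build_create_person_model_request(location, account_id, access_token, model_name):
    """Build (but do not send) a request creating a custom Person model.

    All arguments are placeholders: `location` is your account's Azure
    region (or "trial"), and `access_token` is an account-scoped token
    obtained from the Video Indexer authorization API.
    """
    query = urllib.parse.urlencode({"name": model_name,
                                    "accessToken": access_token})
    url = (f"https://api.videoindexer.ai/{location}/Accounts/{account_id}"
           f"/Customization/PersonModels?{query}")
    return urllib.request.Request(url, method="POST")
```

The same `Customization/PersonModels` path, with GET and DELETE methods, covers listing and removing models, mirroring what the management page offers in the portal.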
Microsoft creates deep, technical content to help developers enhance their proficiency when building solutions using the Azure AI Platform. Our preferred training partners redeliver our LearnAI Bootcamps for customers around the globe on topics including Azure Databricks, Azure Machine Learning service, Azure Search, and Cognitive Services. Umanis, a systems integrator and preferred AI training partner based in France, has been innovating in Big Data and Analytics in numerous verticals for more than 25 years and has developed an effective methodology for guiding customers into the Intelligent Cloud. Here, Philippe Harel, the AI Practice Director at Umanis, describes this methodology and shares lessons learned to empower customers to do more with data and AI.
2019 is the year when artificial intelligence (AI) and machine learning (ML) are shifting from being mere buzzwords to real-world adoption and rollouts across the enterprise. This year reminds us of the cloud adoption curve a few years ago, when it was no longer an option to stay on-premises alone, but a question of how to make the shift. As you draw up plans on how to best use AI, here are some learnings and methodologies that Umanis is following.
Given the ever-increasing speed of
As companies increasingly look to transform their businesses with AI, we continue to add improvements to Azure AI that make it easy for developers and data scientists to deploy, manage, and secure AI functions directly in their applications.