Handling mission-critical processes has always been a tricky business. Much of the technical debt that companies take on comes from architecting systems with multiple layers of redundancy to mitigate outages that could severely impact customers. Architecting and then maintaining these systems has cost enterprises across every industry enormous amounts of productivity and agility.
The solutions that cloud computing provides help enterprises shift away from this cumbersome work. Instead of spending countless weeks or even months trying to craft an effective solution to the problem of handling critical workloads, cloud providers such as Azure now provide an out-of-the-box way to run your critical processes, without fear of outages, and without incurring costs associated with managing your own infrastructure.
One of the latest innovations in this category, developed by the Azure Logic Apps team, is a new SAP connector that helps companies easily integrate with the ERP systems that are critical to the day-to-day success of a business. Often, implementing these solutions can take teams of people months to get right. However, with the SAP connector from Logic Apps, this process often only takes days, or even hours!
The emergence of the cloud and the edge as the new frontiers for computing is an exciting direction—data is now dispersed within and beyond the enterprise, on-premises, in the cloud, and at the edge. We must enable intelligent analysis, transactions, and responsible governance for data everywhere, from creation through to deletion (through the entire lifecycle of ingestion, updates, exploration, data prep, analysis, serving, and archival).
Our commitment to innovation is reflected in our unique collaborative approach to product development. Product teams work in synergy with research and advanced development groups, including Cloud Information Services Lab, Gray Systems Lab, and Microsoft Research, to push boundaries, explore novel concepts and challenge hypotheses.
The Azure Data team continues to lead the way in on-premises and cloud-based database management. SQL Server has been identified as the top DBMS by Gartner for four consecutive years. Our aim is to re-think and redefine data management by developing optimal ways to capture, store and analyze data.
I’m especially excited that this year we have three teams presenting their work: “Socrates: The New SQL Server in the Cloud,” “Automatically Indexing Millions of Databases in Microsoft Azure SQL Database,” and the Gray Systems Lab research team’s “Event Trend Aggregation
Most modern-day businesses employ analytics pipelines for real-time and batch processing. A common characteristic of these pipelines is that data arrives at irregular intervals from diverse sources. This adds complexity in terms of having to orchestrate the pipeline such that data gets processed in a timely fashion.
The answer to these challenges lies in building a decoupled, event-driven pipeline from serverless components, one that responds to changes in data as they occur.
An integral part of any analytics pipeline is the data lake. Azure Data Lake Storage Gen2 provides secure, cost-effective, and scalable storage for the structured, semi-structured, and unstructured data arriving from diverse sources. Azure Data Lake Storage Gen2’s performance, global availability, and partner ecosystem make it the platform of choice for analytics customers and partners around the world. Next comes the event processing aspect. With Azure Event Grid, a fully managed event routing service, Azure Functions, a serverless compute engine, and Azure Logic Apps, a serverless workflow orchestration engine, it is easy to build event-based processing and workflows that respond to events in real time.
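To make the event-driven pattern concrete, here is a minimal sketch of the kind of logic such a pipeline runs when Event Grid delivers a BlobCreated notification. The account, container, and blob names are illustrative, and the handler is written as a plain function standing in for an Azure Function with an Event Grid trigger:

```python
import json

# An illustrative Event Grid notification batch in the standard Event Grid
# schema. Data Lake Storage Gen2 emits events such as
# Microsoft.Storage.BlobCreated when a file is written and closed;
# all resource names below are made up for this example.
SAMPLE_EVENTS = json.dumps([{
    "topic": "/subscriptions/xxx/resourceGroups/rg/providers"
             "/Microsoft.Storage/storageAccounts/mylake",
    "subject": "/blobServices/default/containers/raw/blobs/sales/orders.csv",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2019-05-01T12:00:00Z",
    "data": {
        "url": "https://mylake.dfs.core.windows.net/raw/sales/orders.csv",
        "contentLength": 10240
    }
}])

def blobs_to_process(event_batch_json: str) -> list:
    """Return the URLs of newly created blobs from an Event Grid batch.

    In a real deployment this would be the body of an Azure Function
    bound to an Event Grid trigger; downstream, a Logic App or another
    Function would pick up each URL and run the actual processing.
    """
    events = json.loads(event_batch_json)
    return [e["data"]["url"]
            for e in events
            if e["eventType"] == "Microsoft.Storage.BlobCreated"]

print(blobs_to_process(SAMPLE_EVENTS))
```

The point of the pattern is that no scheduler polls the lake: processing starts the moment the event arrives, regardless of how irregularly data lands.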
Today, we’re very excited to announce that Azure Data Lake Storage Gen2 integration with Azure Event Grid is in preview! This means
Microsoft accelerates application development with Azure Integration Services
Personal computers revolutionized the way work was done. New software unlocked unprecedented levels of productivity, and for a time, business flourished. As the personal computer exploded in popularity, more and more software was created. For the individual, this was a golden age. For the enterprise, this was also a golden age … with an asterisk.
Just as adding more people to an organization introduces overhead, so it was with software. Making software work cooperatively with other, unrelated software turned out to be a very tricky problem to solve; the more software that was added, the more overhead was introduced. The cost of doing business increased, meaningful results decreased, and organizational productivity plummeted.
Large businesses and enterprises were locked in this pattern until a new category of software was created: integration software. For many years, on-premises integration tools, such as Microsoft BizTalk Server, helped mitigate the issues created by the rapid proliferation and adoption of new software.
And then one day, everything changed. The cloud was born, and with it, the need for new ways to connect everything together.
The adoption of cloud-native integration platforms
For businesses today, data is indispensable. Innovative ideas in manufacturing, health care, transportation, and financial industries are often the result of capturing and correlating data from multiple sources. Now more than ever, the ability to reliably ingest and respond to large volumes of data in real time is the key to gaining competitive advantage for consumer and commercial businesses alike. To meet these big data challenges, Azure Event Hubs offers a fully managed and massively scalable distributed streaming platform designed for a plethora of use cases from telemetry processing to fraud detection.
Event Hubs has been immensely popular with Azure’s largest customers and now even more so with the recent release of Event Hubs for Apache Kafka. With this powerful new capability, customers can stream events from Kafka applications seamlessly into Event Hubs without having to run Zookeeper or manage Kafka clusters, all while benefitting from a fully managed platform-as-a-service (PaaS) with features like auto-inflate and geo-disaster recovery. As the front door to Azure’s data pipeline, customers can also use Capture to automatically deliver streaming events into Azure Storage or Azure Data Lake, or natively perform real-time analysis on data streams using Azure Stream Analytics.
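In practice, pointing an existing Kafka application at Event Hubs is mostly a configuration change: the Kafka endpoint listens on port 9093 and authenticates clients over SASL PLAIN using the namespace connection string. The sketch below builds that standard Kafka client configuration as a dictionary; the namespace name and connection string are placeholders:

```python
def kafka_config_for_event_hubs(namespace: str, connection_string: str) -> dict:
    """Build the Kafka client settings needed to point an existing Kafka
    application at an Event Hubs namespace.

    The keys are standard Kafka client configuration property names; the
    same values would go into a Java client's properties file.
    """
    return {
        # Event Hubs exposes its Kafka-compatible endpoint on port 9093.
        "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        # Event Hubs authenticates Kafka clients with the literal user name
        # "$ConnectionString" and the namespace connection string as password.
        "sasl.username": "$ConnectionString",
        "sasl.password": connection_string,
    }

# Placeholder namespace and (truncated) connection string for illustration.
cfg = kafka_config_for_event_hubs("myhub", "Endpoint=sb://myhub.servicebus.windows.net/;...")
print(cfg["bootstrap.servers"])
```

Because only the connection settings change, the producer and consumer code itself keeps using the Kafka APIs it already uses.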
For customers with the most demanding streaming
Kubernetes is taking the app development world by storm. Earlier this month, we shared that the Azure Kubernetes Service (AKS) was the fastest growing compute service in Azure’s history. Customers like Siemens Healthineers, Finastra, Maersk, and Hafslund are realizing the benefits of using AKS to easily deploy, manage, and scale applications without the toil of maintaining infrastructure. As the community and adoption grow, Kubernetes itself is evolving, adding more enterprise-friendly features and extending to more scenarios. The release of production-level support for Windows Server containers is a true testament to this evolution.
Today, we’re excited to announce the preview of Windows Server containers in Azure Kubernetes Service (AKS) for versions 1.13.5 and 1.14.0. With this, Windows Server containers can now be deployed and orchestrated in AKS enabling new paths to migrate and modernize Windows Server applications in Azure.
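In a mixed cluster, scheduling works through standard Kubernetes node selection: a `nodeSelector` on the well-known operating-system node label keeps Windows pods off Linux node pools. A minimal sketch of such a deployment manifest, with illustrative names and a sample Windows image:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: win-webapp            # illustrative name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: win-webapp
  template:
    metadata:
      labels:
        app: win-webapp
    spec:
      nodeSelector:
        # Standard OS node label; ensures these pods are scheduled
        # only onto Windows Server nodes in the cluster.
        "kubernetes.io/os": windows
      containers:
      - name: webapp
        image: mcr.microsoft.com/windows/servercore/iis   # sample Windows image
```

Linux workloads in the same cluster simply omit the selector (or select `linux`), so both can be managed with the same manifests, tooling, and APIs.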
Our customers have applications running on Linux and on Windows. The ability to manage Windows and the latest Linux containers side by side in the same Kubernetes cluster, with the exact same APIs, tools, and support, is what you have been asking for, and it opens an abundance of new scenarios. For example, you can now add Windows node pools to
Optimizing compute resource allocation to achieve performance goals while controlling costs can be a challenging balance to strike, especially for database workloads with complex usage patterns. To help address these challenges, we are pleased to announce the preview of Azure SQL Database serverless. SQL Database serverless (preview) is a new compute tier that optimizes price-performance and simplifies performance management for databases with intermittent and unpredictable usage. Line-of-business applications, dev/test databases, content management, and e-commerce systems are just some examples across a range of applications that often fit the usage pattern ideal for SQL Database serverless. SQL Database serverless (preview) is also well-suited for new applications with compute sizing uncertainty or workloads requiring frequent rescaling in order to reduce costs. The serverless compute tier enjoys all the fully managed, built-in intelligence benefits of SQL Database and helps accelerate application development, minimize operational complexity, and lower total costs.
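The billing difference between the two tiers is easy to see with a little arithmetic. The sketch below contrasts per-second serverless billing with a fixed provisioned allocation; the unit prices are purely hypothetical placeholders, since real SQL Database rates vary by region and SKU:

```python
# Hypothetical unit prices for illustration only -- actual SQL Database
# rates vary by region and are published on the Azure pricing page.
VCORE_SECOND_PRICE = 0.000145      # serverless: per vCore-second consumed
PROVISIONED_HOUR_PRICE = 0.5044    # provisioned: per vCore-hour allocated

def serverless_cost(vcore_seconds_used: float) -> float:
    """Serverless bills only for the compute actually consumed, per second."""
    return vcore_seconds_used * VCORE_SECOND_PRICE

def provisioned_cost(vcores_allocated: int, hours: float) -> float:
    """Provisioned bills for the fixed allocation, whether it is used or not."""
    return vcores_allocated * hours * PROVISIONED_HOUR_PRICE

# A database that is busy about 2 hours a day on 4 vCores and idle otherwise:
busy = serverless_cost(4 * 2 * 3600)    # pays for ~8 vCore-hours of actual use
always_on = provisioned_cost(4, 24)     # pays for 96 vCore-hours allocated
print(round(busy, 2), round(always_on, 2))
```

For intermittent workloads like this the per-second model is the cheaper fit; for steady, high-utilization workloads the fixed provisioned allocation typically wins.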
SQL Database serverless (preview) automatically scales compute for single databases based on workload demand and bills for compute used per second. Serverless contrasts with the provisioned compute tier in SQL Database which allocates a fixed amount of compute resources for a fixed price and is billed per hour. Over short time scales,
Starting the process of migrating to the cloud can be daunting. Legacy systems that are colossal in scale often overwhelm the average team tasked with the mission of digital transformation. How can they possibly untangle years of legacy code to start this new digital transformation initiative? Not only are these systems colossal in scale, but also colossal in terms of business importance. Enterprise applications from vendors like SAP and IBM are integral to the daily rhythm of business. A seemingly simple mistake can have catastrophic consequences.
Over the past year, the Azure Integration Services team has been working on solutions to these challenges, and we’re excited to announce new capabilities:
Developer focused – Improved the developer experience inside Logic Apps by allowing you to write code directly as a step inside a Logic App.
Enterprise ready – Added new migration and modernization scenarios with the general availability of our new-and-improved SAP connector.
Serverless first – Better integration between API Management and Azure Functions makes it even easier to create and manage serverless integrations and applications.

The challenges facing customers
Over the past year, we’ve had the opportunity to meet with and hear from customers in-person to discuss the biggest challenges facing
Things are changing for the modern business. API-first development and microservices architecture is opening the door to new innovations. Many of these new approaches are possible in part due to the evolution of serverless technology, which eliminates the need for the management of infrastructure.
Fully managed infrastructure allows for allocating resources to solving a business problem, rather than managing the IT infrastructure. This results in more agility, reduced operating cost, and shorter time-to-market, which is important for organizations of any size.
Serverless is for all, no matter the size
The benefits serverless offers are independent of the size of the company. For example:
Startups need to quickly assess product-market fit and build prototypes to test their hypotheses.
With limited resources, startups can build, measure, and iterate their way to success with execution-based pricing models. Serverless unlocks a new generation of startups, all built on the idea that a small group of people with a limited budget can be disruptive. As they grow, they’ll benefit from serverless in much the same way larger organizations do.
Enterprises need to adapt to constantly evolving customer requirements to stay competitive with agile, fast moving startups.
Serverless enables a business to grow without worrying about