The Azure Cosmos DB team is announcing the general availability of version 3 of the Azure Cosmos DB .NET SDK, released in July. Thank you to all who gave feedback during our preview.
In this post, we’ll walk through the latest improvements that we’ve made to enhance the developer experience in .NET SDK v3.
// Using .NET CLI
dotnet add package Microsoft.Azure.Cosmos

// Using NuGet
Install-Package Microsoft.Azure.Cosmos

What is Azure Cosmos DB?
Azure Cosmos DB is a globally distributed, multi-model database service that enables you to read and write data from any Azure region. It offers turnkey global distribution, guarantees single-digit millisecond latencies at the 99th percentile, 99.999 percent high availability, and elastic scaling of throughput and storage.
What is new in Azure Cosmos DB .NET SDK version 3?
Version 3 of the SDK contains numerous usability and performance improvements, including a new intuitive programming model, support for stream APIs, built-in support for change feed processor APIs, the ability to scale non-partitioned containers, and more. The SDK targets .NET Standard 2.0 and is open sourced on GitHub.
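As a sketch of the new programming model, the snippet below creates a database, a container, and a single item with the v3 `CosmosClient`. The account endpoint, key, and the database/container/item names are placeholders, not values from this post, and error handling is omitted for brevity:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class Program
{
    public static async Task Main()
    {
        // Placeholder endpoint and key; substitute your own account values.
        using CosmosClient client = new CosmosClient(
            "https://<your-account>.documents.azure.com:443/", "<account-key>");

        // Databases and containers are addressed fluently from the client;
        // response types convert implicitly to Database/Container.
        Database database = await client.CreateDatabaseIfNotExistsAsync("demo-db");
        Container container = await database.CreateContainerIfNotExistsAsync("items", "/pk");

        // Typed item API; stream variants (e.g. CreateItemStreamAsync) let you
        // skip serialization and work with raw payloads instead.
        var item = new { id = Guid.NewGuid().ToString(), pk = "sample", name = "hello" };
        var response = await container.CreateItemAsync(item, new PartitionKey("sample"));
        Console.WriteLine($"Request charge: {response.RequestCharge} RU");
    }
}
```

The same `Container` object exposes the query, change feed, and stream APIs, so most operations hang off one handle rather than the document-link strings used in v2.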
For new workloads, we recommend starting with the latest version of the SDK.
When data is the lifeblood of your business, you want to ensure your databases are reliable, secure, and available when called upon to perform. Service level agreements (SLAs) set an expectation for uptime and performance, and are a key input for designing systems to meet business needs. We recently published a new version of the SQL Database SLA, guaranteeing the highest availability among relational database services and introducing the industry’s first business continuity SLA. These updates further cement our commitment to ensuring your data is safe and that the apps and processes your business relies on continue running in the face of a disruptive event.
As we indicated in the recent service update, we made two major changes in the SLA. First, Azure SQL Database now offers a 99.995 percent availability SLA for zone-redundant databases in its business critical tier. This is the highest SLA in the industry among all relational database services. It is also backed by up to a 100 percent monthly cost credit when the SLA is not met. Second, we offer a business continuity SLA for databases in the business critical tier that are geo-replicated between two different Azure regions. That SLA comes with very
https://azure.microsoft.com/blog/how-hsbc-built-its-payme-for-business-app-on-microsoft-azure/

Bank-grade security, super-fast transactions, and analytics: If you live in Asia or have ever traveled there, you’ve probably witnessed the dramatic impact that mobile technology has had on all aspects of day-to-day life. In Hong Kong in particular, …
Migrating hundreds of SQL Server instances and thousands of databases to Azure SQL Database, our platform as a service (PaaS) offering, is a considerable task. To streamline the process as much as possible, you need to feel confident about your relative readiness for migration. Being able to identify low-hanging fruit, such as the servers and databases that are fully ready or that require minimal effort to prepare for migration, eases and accelerates your efforts. We are pleased to share that Azure database target readiness recommendations have been enabled.
The Azure Migrate hub provides a unified view of all your migrations across servers, applications, and databases. This integration gives customers a seamless migration experience beginning in the discovery phase. It allows customers to use assessment tools for visibility into the applications currently running on-premises, so that they can determine cloud suitability and project the cost of running their applications in the cloud. It also lets customers compare competing public and hybrid cloud options.
Assessing and viewing results
Assessing the overall readiness of your data estate for a migration to Azure SQL Database requires only a few steps:
Provision an instance of Azure Migrate, create
Tomorrow, July 9, 2019, marks the end of extended support for SQL Server 2008 and 2008 R2. These releases transformed the database industry, with all the core components of a database platform built-in at a fraction of the cost of other databases. We saw broad adoption across applications, data marts, data warehousing, and business intelligence. Thank you for the ten amazing years we’ve had together.
But now support for the SQL Server 2008 and R2 versions is ending. Whether you prefer the evergreen SQL of Azure SQL Database managed instance, which never needs to be patched or upgraded, or need the flexibility and configurability of SQL Server hosted on an Azure Virtual Machine with three free years of Extended Security Updates, Azure provides the best choice of destinations to secure and modernize your database.
Customers are moving critical SQL Server workloads to Azure
Customers like Allscripts, Komatsu, Paychex, and Willis Towers Watson are taking advantage of these innovative destinations and migrating their SQL Server databases to Azure. Danish IT solutions provider KMD needed a home for their legacy SQL Server in the cloud. They had to migrate an 8-terabyte production database to the cloud quickly and without interruption.
The emergence of the cloud and the edge as the new frontiers for computing is an exciting direction—data is now dispersed within and beyond the enterprise, on-premises, in the cloud, and at the edge. We must enable intelligent analysis, transactions, and responsible governance for data everywhere, from creation through to deletion (through the entire lifecycle of ingestion, updates, exploration, data prep, analysis, serving, and archival).
Our commitment to innovation is reflected in our unique collaborative approach to product development. Product teams work in synergy with research and advanced development groups, including Cloud Information Services Lab, Gray Systems Lab, and Microsoft Research, to push boundaries, explore novel concepts and challenge hypotheses.
The Azure Data team continues to lead the way in on-premises and cloud-based database management. SQL Server has been identified as the top DBMS by Gartner for four consecutive years. Our aim is to re-think and redefine data management by developing optimal ways to capture, store and analyze data.
I’m especially excited that this year we have three teams presenting their work: “Socrates: The New SQL Server in the Cloud,” “Automatically Indexing Millions of Databases in Microsoft Azure SQL Database,” and the Gray Systems Lab research team’s “Event Trend Aggregation
Data is rarely simple. Not every piece of data we have fits nicely into a single Excel worksheet of rows and columns. Data has many diverse relationships, such as the multiple locations and phone numbers for a single customer or the multiple authors and genres of a single book. Of course, relationships are typically even more complex than this, and as we start to leverage AI to understand our data, the additional learnings we get only add to the complexity of those relationships. For that reason, expecting customers to flatten their data so it can be searched and explored is often unrealistic. We heard this often, and it quickly became our number one most requested Azure Search feature. Because of this, we were excited to announce the general availability of complex types support in Azure Search. In this post, I want to take some time to explain what complex types support adds to Azure Search and the kinds of things you can build using this capability.
Azure Search is a platform as a service that helps developers create their own cloud search solutions.
What is complex data?
Complex data consists of data that includes hierarchical or nested substructures that do
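As an illustrative sketch (the index and field names here are hypothetical, not from this post), a complex field is declared in the index definition by giving it the type Edm.ComplexType, or Collection(Edm.ComplexType) for a repeating substructure, together with its own subfields:

```json
{
  "name": "customers",
  "fields": [
    { "name": "id", "type": "Edm.String", "key": true },
    { "name": "name", "type": "Edm.String", "searchable": true },
    {
      "name": "addresses",
      "type": "Collection(Edm.ComplexType)",
      "fields": [
        { "name": "city", "type": "Edm.String", "filterable": true },
        { "name": "phone", "type": "Edm.String", "searchable": true }
      ]
    }
  ]
}
```

With a definition like this, each customer document can carry any number of nested address objects, and queries can filter or search on the subfields directly instead of working against a flattened copy of the data.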
This blog was co-authored by Shweta Mishra, Senior Solutions Architect, CitiusTech and Vinil Menon, Chief Technology Officer, CitiusTech
CitiusTech is a specialist provider of healthcare technology services that helps its customers accelerate innovation in healthcare. CitiusTech used Azure Cosmos DB to simplify the real-time collection and movement of healthcare data from a variety of sources in a secure manner. With the proliferation of patient information from established and newer sources, accompanied by stringent regulations, healthcare systems today are gradually shifting toward near real-time data integration. To achieve this, healthcare systems not only need low latency and high availability, but must also be highly responsive. Furthermore, they need to scale effectively to manage the inflow of high-speed, large volumes of healthcare data.
The rise of the Internet of Things (IoT) has enabled ordinary medical devices, wearables, and traditional hospital-deployed medical equipment to collect and share data. Within a wide area network (WAN), there are well-defined standards and protocols, but with the ever-increasing number of devices getting connected to the internet, there is a general lack of standards compliance and consistency of implementation. Moreover, data collation and generation from IoT-enabled medical and mobile devices need specialized
https://azure.microsoft.com/blog/microsoft-fhir-server-for-azure-extends-to-sql/

This blog post was co-authored by Doug Seven, Senior Director, Microsoft Health Engineering and Michael Hansen, Senior Program Manager, Microsoft Health Engineering. Since the launch of the open source FHIR Server for Azure on GitHub last November, we have been …