Cloud data lakes solve a foundational problem for big data analytics: providing secure, scalable storage for data that traditionally lives in separate data silos. Data lakes were designed from the start to break down data barriers and jump-start big data analytics efforts. However, a final “silo busting” frontier remained: enabling multiple data access methods for all data (structured, semi-structured, and unstructured) that lives in the data lake.
Providing multiple data access points to shared data sets allows tools and data applications to interact with the data in their most natural way. It also lets your data lake benefit from the tools and frameworks built for a wide variety of ecosystems. For example, you might ingest data via an object storage API, process it using the Hadoop Distributed File System (HDFS) API, and then load the transformed data into a data warehouse via an object storage API.
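To make the idea concrete, here is a minimal sketch of how the same object in a multi-protocol data lake can be addressed two ways: through an object storage (Blob) URL, and through an HDFS-compatible (ABFS driver) URI. The account, container, and file names are hypothetical examples, not values from this announcement.

```python
# Sketch: the same file in the lake, addressed via two protocols.
# Account/container/path names below are illustrative only.

def blob_url(account: str, container: str, path: str) -> str:
    """Object storage (Blob) style URL for a file in the lake."""
    return f"https://{account}.blob.core.windows.net/{container}/{path}"

def abfs_uri(account: str, filesystem: str, path: str) -> str:
    """HDFS-style URI (ABFS driver) for the same underlying file."""
    return f"abfss://{filesystem}@{account}.dfs.core.windows.net/{path}"

# Both of these point at the same underlying data:
print(blob_url("contosolake", "raw", "2019/07/sales.csv"))
print(abfs_uri("contosolake", "raw", "2019/07/sales.csv"))
```

An ingestion job could write through the Blob endpoint while a Hadoop or Spark job reads the identical bytes through the ABFS endpoint, with no copy in between.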
Single storage solution for every scenario
We are very excited to announce the preview of multi-protocol access for Azure Data Lake Storage! Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data, including access via the Azure Blob storage API.
Co-locate your Azure resources for improved application performance
The performance of your applications is central to the success of your IT organization. Application performance can directly impact your ability to increase customer satisfaction and ultimately grow your business.
Many factors can affect the performance of your applications. One of those is network latency which is impacted, among other things, by the physical distance between the virtual machines deployed.
For example, when you place your Microsoft Azure Virtual Machines in a single Azure region, the physical distance between the virtual machines is reduced. Placing them within a single availability zone is another step you can take to deploy your virtual machines closer to each other. However, as the Azure footprint grows, a single availability zone may span multiple physical data centers resulting in network latency that can impact your overall application performance. If a region does not support availability zones or if your application does not use availability zones, the latency between the application tiers may increase as a result.
Today, we are announcing the preview of proximity placement groups, a new capability for co-locating your Azure Infrastructure as a Service (IaaS) resources and reducing the network latency between them.
In a world where data volume, variety, and type are exponentially growing, organizations need to collaborate with data of any size and shape. In many cases data is at its most powerful when it can be shared and combined with data that resides outside organizational boundaries with business partners and third parties. For customers, sharing this data in a simple and governed way is challenging. Common data sharing approaches using file transfer protocol (FTP) or web APIs tend to be bespoke development and require infrastructure to manage. These tools do not provide the security or governance required to meet enterprise standards, and they often are not suitable for sharing large datasets. To enable enterprise collaboration, we are excited to unveil Azure Data Share Preview, a new data service for sharing data across organizations.
Simple and safe data sharing
Data professionals in the enterprise can now use Azure Data Share to easily and safely share big data with external organizations from Azure Blob Storage and Azure Data Lake Storage, with support for additional Azure data services to follow. As a fully managed Azure service, Azure Data Share requires no infrastructure to set up, and it scales to meet big data sharing demands.
Last July, I shared our approach to helping customers migrate to Azure. Since then, we’ve seen tremendous customer response, working with organizations such as Allscripts, Chevron, J.B. Hunt, and Carlsberg, and we’ve gained valuable insights about customer needs along their journey. Today, we are bringing together a best practice-based, holistic experience for migrating existing applications and systems to Azure.
Azure Migration Program
Azure Migration Program includes the prescriptive advice, resources, and tools customers need for a successful path to the cloud from start to finish. Built on proven cloud adoption methodologies and best practices, the program helps customers ensure their move to Azure is successful. Through the program, customers will work hand in hand with Microsoft experts and specialized migration partners to receive:
- Curated, step-by-step guidance from Microsoft experts and specialized migration partners, based on the proven Cloud Adoption Framework for Azure methodology.
- Technical skill building with foundational and role-specific courses to develop new Azure skills and ensure long-term organizational readiness.
- Free Azure migration tools, including Azure Migrate to assess and migrate workloads, and free Azure Cost Management to optimize costs.
- Offers to reduce migration costs, including Azure Hybrid Benefit and free Extended Security Updates for Windows Server 2008 and SQL Server 2008.
Moving on-premises apps and data to the cloud is a key step in our customers’ migration journey, and we’re committed to helping simplify that process. Earlier this year, we invited customers to participate in the preview of multiple new migration capabilities. Today, I am excited to announce the latest evolution of Azure Migrate, which provides a streamlined, comprehensive portfolio of Microsoft and partner tools to meet migration needs, all in one place.
With the general availability of Azure Migrate, including the new integrated partner experience, Server Assessment, Server Migration, Database Assessment, and Database Migration capabilities, we strive to make the cloud journey even easier for customers. Azure Migrate acts as a central hub for all migration needs and tools from infrastructure to applications to data. We are truly democratizing the migration process with guidance and choice.
New Azure Migrate integrated experience
The new experience provides you access to Microsoft and ISV tools and helps identify the right tool for your migration scenario. To help with large-scale datacenter migrations and cloud transformation projects, we’ve also added end-to-end progress tracking.
New features include:
Guided experience for the most common migration scenarios, such as server and database migration and data movement to Azure
Enabling partners to deliver differentiated managed services
Partners are at the heart of incredible innovation stories on Azure, lighting up the path to success for our customers. Today, we are launching the general availability of Azure Lighthouse, a set of capabilities for cross-customer management at scale that lets partners differentiate their services and benefit from greater efficiency and automation. The partners who used this new capability early on are fans!
Together with Gavriella Schuster, Corporate Vice President, One Commercial Partner, I’m extending our thanks to the Azure Expert MSP communities for their role in this journey with us on Azure Lighthouse. More than ever before, our customers count on our partners’ expertise in migrating mission-critical workloads to Azure, running scalable and dynamic applications and architecting modern apps with Azure Kubernetes Service (AKS). Azure Lighthouse is part of our ongoing investment to empower the ecosystem with efficient tools, so partners can grow profitably as their customer base reaches new peaks. Over the past year, we’ve sailed the seas of product development together to do so.
Enabling higher automation across the lifecycle of managing customers (from patching to log analysis, policy, configuration, and compliance), Azure Lighthouse brings scale and precision together for service providers at no additional cost.
One of the newest members of the Azure AI portfolio, Form Recognizer, applies advanced machine learning to accurately extract text, key-value pairs, and tables from documents. With just a few samples, it tailors its understanding to supplied documents, both on-premises and in the cloud.
Introducing the new pre-built receipt capability
Form Recognizer focuses on making it simpler for companies to utilize the information latent in business documents such as forms. Now we are making it easier to handle one of the most commonplace documents in any business, the receipt, “out of the box.” Form Recognizer’s new pre-built receipt API identifies and extracts key information from sales receipts, such as the time and date of the transaction, merchant information, tax amounts, totals, and more, with no training required.
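As a rough illustration of consuming such a result, the sketch below pulls key fields out of a receipt-analysis response. The JSON shape and field names here are a simplified, hypothetical stand-in for the service’s actual response format, chosen only to show the pattern of field-plus-confidence extraction.

```python
# Sketch: extracting key fields from a receipt-analysis result.
# The response shape and field names below are illustrative only,
# not the service's documented schema.

def extract_receipt_fields(result: dict) -> dict:
    """Pull merchant, date, and total out of a (simplified) analysis result."""
    fields = result.get("fields", {})
    wanted = ("MerchantName", "TransactionDate", "Total")
    return {name: fields[name]["value"] for name in wanted if name in fields}

# A hypothetical response for one analyzed receipt:
sample = {
    "fields": {
        "MerchantName": {"value": "Contoso Cafe", "confidence": 0.98},
        "TransactionDate": {"value": "2019-07-01", "confidence": 0.95},
        "Total": {"value": 12.40, "confidence": 0.97},
    }
}
print(extract_receipt_fields(sample))
```

In a real integration, the per-field confidence scores would typically gate whether a value is auto-filled into an expense report or routed for human review.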
Streamlining expense reporting
Business expense reporting can be cumbersome for everyone involved in the process. Manually filling out and approving expense reports is a significant time sink for both employees and managers. Aside from productivity lost to expense reporting, there are also pain points around auditing expense reports. A solution to automatically extract merchant and transaction information from receipts can significantly reduce the manual effort of reporting and auditing expenses.
Better scale and more power for IT pros and developers
Azure Files has always delivered secure, fully managed cloud file shares with a full range of data redundancy options. While customers love the simplicity of Azure Files and the hybrid capabilities of Azure File Sync, until now, scaling cloud file shares beyond 5 TiB required changing the paradigm for accessing data.
Today, we are excited to announce the preview of a larger and higher scale standard tier for Azure Files, now available to all Azure customers. This preview significantly improves your experience by increasing standard file shares’ capacity and performance limits. In select regions, standard file shares in general purpose accounts can support the following larger limits.
Azure Files standard storage scale limits:

    Limit                      Before (standard tier)   New (standard tier)
    Capacity per share         5 TiB                    100 TiB (20x increase)
    Max IOPS per share         1,000 IOPS               10,000 IOPS (10x increase)
    Max throughput per share   Up to 60 MiB/s           Up to 300 MiB/s (5x increase)
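As a quick back-of-the-envelope on what the new per-share limits mean in practice, the sketch below estimates how long it would take to fill a 100 TiB share at the new 300 MiB/s throughput cap. This assumes an idealized, sustained peak rate; real workloads will vary.

```python
# Back-of-the-envelope: time to write a full 100 TiB share at the
# new 300 MiB/s per-share throughput cap (idealized, sustained rate).
TIB_IN_MIB = 1024 * 1024          # 1 TiB = 1,048,576 MiB

capacity_mib = 100 * TIB_IN_MIB   # 100 TiB share
throughput_mib_s = 300            # new per-share cap, MiB/s

seconds = capacity_mib / throughput_mib_s
print(f"{seconds / 86400:.1f} days")  # prints "4.0 days"
```

Even at sustained peak throughput, filling the new maximum share size takes roughly four days, which is one reason offline transfer options remain relevant for bulk seeding.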
Performance limits for a single file remain the same at 1 TiB, 1,000 IOPS, and 60 MiB/s. Standard file shares are backed by hard disk drives. If your workload is latency sensitive, you should consider the Azure Files premium tier.
Our customers continue to use the Azure Data Box family to move massive amounts of data into Azure. One of the regular requests that we receive is for a larger capacity option that retains the simplicity, security, and speed of the original Data Box. Last year at Ignite, we announced a new addition to the Data Box family that did just that: a preview of the petabyte-scale Data Box Heavy.
With thanks to those customers who provided feedback during the preview phase, I’m excited to announce that Azure Data Box Heavy has reached general availability in the US and EU!
How Data Box Heavy works
In many ways, Data Box Heavy is just like the original Data Box. You can order Data Box Heavy directly from the Azure portal, and copy data to Data Box Heavy using standard file or object protocols. Data is automatically secured on the appliance using AES 256-bit encryption. After your data is transferred to Azure, the appliance is wiped clean according to National Institute of Standards and Technology (NIST) standards.
But Data Box Heavy is also designed for a much larger scale than the original Data Box, with one petabyte of raw capacity.