Many customers are using Azure Service Fabric to build and operate always-on, highly scalable, microservice applications. Recently, we open sourced Service Fabric with the MIT license to increase opportunities for customers to participate in the development and direction of the product. Today, we are excited to announce the release of Service Fabric runtime v6.2 and corresponding SDK and tooling updates.
This release includes:
- General availability of Java and .NET Core Reliable Services and Actors on Linux
- Public preview of Red Hat Enterprise clusters
- Enhanced container support
- Improved monitoring and backup/restore capabilities
The updates will be available in all regions over the next few days and details can be found in the release notes.
Reliable Services and Reliable Actors on Linux are generally available
Reliable Services and Reliable Actors are programming models to help developers build stateless and stateful microservices for new applications and for adding new microservices to existing applications. Now you can use your preferred language to build Reliable Services and Actors with the Service Fabric API using .NET Core 2.0 and Java 8 SDKs on Linux.
Red Hat Enterprise clusters in public preview
Azure Service Fabric clusters
We are happy to announce Azure IaaS VM backup support for network-restricted storage accounts. Storage firewalls and virtual networks let you allow traffic only from selected virtual networks and subnets, creating a secure network boundary for the unmanaged disks in your storage accounts. You can also grant access to on-premises networks and other trusted internet traffic by using network rules based on IP address ranges. With this announcement, scheduled and ad-hoc IaaS VM backups and restores continue to work for these VNET-configured storage accounts.
After you configure firewall and virtual network settings for your storage account, select Allow trusted Microsoft services to access this storage account as an exception to enable Azure Backup service to access the network restricted storage account.
This feature gives customers a seamless experience while enforcing network access-based security: only requests coming from approved Azure VNETs or specified public IP ranges are allowed to reach a given storage account, making it more secure and helping fulfill an organization's compliance requirements.
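As a rough illustration of how network rules based on IP address ranges work, here is a minimal sketch using Python's standard `ipaddress` module. The ranges, function name, and logic are hypothetical and purely illustrative; they do not reflect Azure's actual rule-evaluation implementation.

```python
import ipaddress

# Illustrative allowed ranges; real rules would come from the storage
# account's firewall configuration, not hard-coded values like these.
ALLOWED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # example on-premises range
    ipaddress.ip_network("198.51.100.0/24"),  # example trusted internet range
]

def is_request_allowed(source_ip: str) -> bool:
    """Return True if the source IP falls inside an approved range."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in ALLOWED_RANGES)
```

A request from `203.0.113.10` would be admitted under these example rules, while one from an unlisted range would be denied.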
Today, we are delighted to announce increased scale limits for Azure Backup. Users can now create as many as 500 recovery services vaults in each subscription per region as compared to the earlier limit of 25 vaults per region per subscription. Customers who have been hitting the vault limits due to a restriction of 25 vaults can now go ahead and create vaults to manage their resources better. In addition, the number of Azure virtual machines that can be registered against each vault has been increased to 1,000 from the earlier limit of 200 machines under each vault.
Key benefits
- Better management of resources between departments in an organization: flexibility to create a large number of vaults under a subscription, and a large number of containers under a vault, based on departmental requirements without worrying about hitting vault limits.
- Better granularity in reporting and monitoring of data within vaults: users can create separate vaults segregated by organizational needs and get more granular reporting of backup usage on a per-vault basis.
- Systematic and comprehensive billing: users can get vault-level detailed billing for a subscription for better financial management within an organization.
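The new scale limits above are simple enough to capture in a short sketch. The constants come from this announcement; the function names and the idea of checking limits client-side are purely illustrative, not part of any Azure API.

```python
# New Azure Backup scale limits from the announcement.
MAX_VAULTS_PER_REGION = 500   # previously 25 vaults per region per subscription
MAX_VMS_PER_VAULT = 1000      # previously 200 VMs per vault

def can_create_vault(existing_vaults_in_region: int) -> bool:
    """True if another Recovery Services vault fits in this region."""
    return existing_vaults_in_region < MAX_VAULTS_PER_REGION

def can_register_vm(vms_in_vault: int) -> bool:
    """True if another Azure VM can be registered against this vault."""
    return vms_in_vault < MAX_VMS_PER_VAULT
```

For example, a subscription that previously hit the 25-vault ceiling can now grow to 500 vaults per region before `can_create_vault` returns False.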
Today we’re letting our customers know about our upcoming Data Subject Request (DSR) processing capability in the Azure portal, which will provide tenant admins a simple, powerful tool to quickly fulfill the Data Subject Requests that are central to compliance with the European Union General Data Protection Regulation (GDPR). We will fully support these DSR capabilities before May 25, 2018, the date when enforcement of the GDPR begins and when Microsoft has committed to be GDPR compliant across our cloud services.
The GDPR is the most significant change to EU privacy law in two decades and sets a new global standard for privacy rights, governing the handling and use of personal data. A fundamental tenet of the GDPR is the set of rights it grants individuals, or data subjects, in connection with their personal data collected by an organization (known as the data controller).
If your organization collects, hosts, or analyzes the personal data of EU residents, GDPR provisions require you to use data processors that guarantee their ability to implement the technical and organizational requirements of the GDPR. The GDPR also requires you to respond to requests from individuals, or data subjects, to receive a copy of their personal
Has your organization failed to devise a business continuity and disaster recovery plan because of the perception that it’s complex or expensive? Or perhaps you have a disaster recovery plan, but maybe you’re not testing it frequently enough because of concerns about impacting production systems.
If you’re in either category, you’ll want to start developing a plan now, especially if your company does business in the European Union (EU) or might hold any data on EU citizens. The General Data Protection Regulation (GDPR), which goes into effect May 25, 2018, is the EU’s new data protection regulation. While it doesn’t explicitly require that you back up data or implement a site recovery solution, the GDPR requirements provide additional reasons to stop waiting and fine-tune your DR plan:
- Under the GDPR, data controllers and data processors must “provide a copy of the personal data undergoing processing” (Article 15).
- Companies must also have “the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident” (Article 32).
- The GDPR also grants EU citizens the “right to data portability” (Article 20), which you can’t grant if you lose
So many customers I talk to want to upload their offline data stores into the cloud. Yet no one wants to spend hours and hours inserting tapes, connecting older hard disks, or figuring out how to digitize and upload film. Well, I’m excited to announce that, together with our partners, Microsoft Azure is making it easy with our Offline Media Import Program. This partner-enabled service makes it easy to move data into Azure from almost any media, such as tapes, optical drives, hard disks, or film.
Why migrate your current storage media to Azure? Azure provides a range of flexible storage options from low-cost, archive storage to high-performance, SSD-based storage. You simply choose the storage tier and we take care of the rest. And once the data is available in Azure, higher-value scenarios around analysis, transformation, and distribution can be unlocked. Here are some of the common uses:
Media and entertainment
Offline media import is a great way for entertainment companies to modernize their content assets and take advantage of an array of cloud services such as cognitive services and media analytics. I’m actually at NAB this week talking to media companies about how this program can transform production workflows
One of the fastest areas of growth in cloud computing is data storage. With a variety of workloads such as IoT telemetry, logging, media, genomics, and archival driving cloud data growth, the need for scalable capacity, bandwidth, and transactions for storing and analyzing data for business insights is more important than ever.
Up to 10x increase to Blob storage account scalability
We are excited to announce improvements in the capacity and scalability of standard Azure storage accounts, which greatly improves your experience building cloud-scale applications using Azure Storage. Effective immediately, via a request made to Azure Support, Azure Blob storage accounts or General Purpose v2 storage accounts can support the following larger limits. The defaults remain the same as before.
- Max capacity for Blob storage accounts: 5 PB (10x increase)
- Max TPS/IOPS for Blob storage accounts: 50K (2.5x increase)
- Max ingress for Blob storage accounts: previously 5-20 Gbps (varies by region/redundancy type), now 50 Gbps (up to 10x increase)
- Max egress for Blob storage accounts: previously 10-30 Gbps (varies
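To put the new 50 Gbps ingress limit in perspective, a quick back-of-envelope calculation shows the sustained daily volume it allows. This is simple arithmetic on the limit quoted above, not an Azure-provided figure.

```python
# Back-of-envelope: sustained daily volume at the new 50 Gbps ingress limit.
INGRESS_GBPS = 50                                 # gigabits per second
bytes_per_second = INGRESS_GBPS * 1e9 / 8         # 6.25 GB/s
tb_per_day = bytes_per_second * 86_400 / 1e12     # seconds per day -> TB
# roughly 540 TB per day at full, sustained ingress
```

In other words, an account driven at the full limit could take in on the order of half a petabyte per day, which is why the 5 PB capacity increase matters alongside the bandwidth increase.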
The preview for long-term backup retention in Azure SQL Database was announced in October 2016, providing you with a way to easily manage long-term retention for your databases – up to 10 years – with backups stored in your own Azure Backup Service Vault.
Based upon feedback gathered during the preview, we are happy to announce a set of major enhancements to the long-term backup retention solution. With this update we have eliminated the need for you to deploy and manage a separate Backup Service Vault. Instead, SQL Database will utilize Azure Blob Storage under the covers to store and manage your long-term backups. This new design will enable flexibility for your backup strategy, and overall more control over costs.
This update brings you the following additional benefits:
- More regional support: long-term retention will be supported in all Azure regions and national clouds.
- More flexible backup policies: you can customize the frequency of long-term backups for each database with policies covering weekly, monthly, yearly, and specific week-within-a-year backups.
- Management of individual backups: you can delete backups that are not critical for compliance.
- Streamlined configuration: no need to provision a separate backup service vault.
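A retention policy along these lines could be sketched as a simple predicate over backup dates. This is a hypothetical model of the weekly/monthly/yearly options described in the announcement; the parameter names and selection rules (e.g. "first backup of the month") are illustrative assumptions, not the actual SQL Database policy schema.

```python
from datetime import date
from typing import Optional

def is_retained(backup_date: date, keep_weekly: bool, keep_monthly: bool,
                yearly_week: Optional[int]) -> bool:
    """Decide whether a weekly full backup is kept long term (illustrative)."""
    if keep_weekly:
        # Keep every weekly backup.
        return True
    if keep_monthly and backup_date.day <= 7:
        # Keep the first weekly backup of each month (assumed convention).
        return True
    if yearly_week is not None and backup_date.isocalendar()[1] == yearly_week:
        # Keep the backup taken in a specific ISO week of the year.
        return True
    return False
```

Under a monthly-only policy, for instance, a backup taken on the 3rd of a month would be retained while one taken on the 15th would be eligible for deletion.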
In today’s digital world where data is the new currency, protecting this data has become more important than ever before. In 2017, attackers had a huge impact on businesses as we saw a large outbreak of ransomware attacks like WannaCry, Petya, and Locky. According to a report from Malwarebytes, ransomware detections were up 90 percent and 93 percent for businesses and consumers, respectively, in 2017. When a machine gets attacked by ransomware, backups are usually the last line of defense that customers resort to.
With increasing innovation in the ransomware space, attackers are no longer restricting themselves to data corruption alone. Backups are becoming the next line of attack for ransomware tools, and when you are left with neither your data nor its backups, you end up hostage to the attacker. This month saw the advent of a new ransomware, Zenis (found by the MalwareHunterTeam), which not only encrypts your files but also purposely deletes your backups.
Combating these attacks requires more than just taking a backup. A backup is only the first step in your protection; it is just as important to safeguard those backups.
Ask these questions to check how secure your
Today we are excited to announce the public preview of soft delete for Azure Storage Blobs! The feature is available in all regions, both public and private.
When turned on, soft delete enables you to save and recover your data when blobs or blob snapshots are deleted. This protection extends to blob data that is erased as the result of an overwrite.
How does it work?
When data is deleted, it transitions to a soft deleted state instead of being permanently erased. When soft delete is on and you overwrite data, a soft deleted snapshot is generated to save the state of the overwritten data. Soft deleted objects are invisible unless explicitly listed. You can configure the amount of time soft deleted data remains recoverable before it is permanently deleted.
For example, when a blob’s content B0 is overwritten with B1, a soft deleted snapshot of B0 is generated. When the blob is then deleted, the root version (B1) also moves into a soft deleted state.
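The overwrite and delete behavior described above can be modeled with a small in-memory sketch. The class and method names here are purely illustrative; this is a toy model of the semantics, not the Azure Storage API.

```python
class SoftDeleteBlobStore:
    """Toy in-memory model of blob soft delete semantics (illustrative only)."""

    def __init__(self):
        self.active = {}        # blob name -> current data
        self.soft_deleted = {}  # blob name -> recoverable prior versions

    def put(self, name, data):
        # Overwriting existing data generates a soft deleted snapshot of it.
        if name in self.active:
            self.soft_deleted.setdefault(name, []).append(self.active[name])
        self.active[name] = data

    def delete(self, name):
        # Deleting moves the root version into the soft deleted state
        # instead of permanently erasing it.
        if name in self.active:
            self.soft_deleted.setdefault(name, []).append(self.active.pop(name))

    def undelete(self, name):
        # Restore the most recent soft deleted version while still retained.
        if self.soft_deleted.get(name):
            self.active[name] = self.soft_deleted[name].pop()
```

Walking through the B0/B1 example: `put("blob", "B0")` then `put("blob", "B1")` leaves B0 as a soft deleted snapshot; `delete("blob")` moves B1 into the soft deleted state as well, from which `undelete` can recover it.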
Soft delete is backwards compatible; you don’t have to make changes to your applications to take advantage of the protections this