Category Archives: Hybrid

25 Sep

Red Hat OpenShift and Microsoft Azure Stack together for hybrid enterprise solutions

Red Hat and Microsoft expand partnership to offer complete combined hybrid cloud solutions – jointly supported and in market today.

This week at Ignite in Orlando, Microsoft and Red Hat demonstrated their solution for hybrid enterprise container platforms – OpenShift Container Platform for Microsoft Azure Stack. This solution has joint support from Microsoft and Red Hat and was first announced earlier this year in May.

Microsoft and Red Hat are both committed to customer solutions that span on-premises and public cloud.  Together, Azure and Azure Stack deliver the industry’s only truly consistent and comprehensive hybrid cloud platform, which enables a unified approach to application development. OpenShift Container Platform is Red Hat’s container application platform, bringing Docker and Kubernetes to the enterprise and creating consistent solutions both on-premises and in the cloud.

OpenShift and Azure Stack present exciting new options for customers who use Microsoft and Red Hat technologies, offering the greatest possible flexibility and consistency in where these solutions are run and managed – whether it’s in the public cloud or on-premises with Azure Stack. OpenShift and Azure Stack enable a consistent application experience across Azure, Azure Stack, bare metal, Windows, and RHEL, bringing together Microsoft’s and Red Hat’s developer frameworks.


27 Aug

Sharing a self-hosted Integration Runtime infrastructure with multiple Data Factories

The Integration Runtime (IR) is the compute infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments. If you need to perform data integration and orchestration securely in a private network environment, which does not have a direct line-of-sight from the public cloud environment, you can install a self-hosted IR on premises behind your corporate firewall, or inside a virtual private network.

Until now, you were required to create at least one such compute infrastructure in every data factory for hybrid and on-premises data integration. This means that if you have ten data factories used by different project teams to access on-premises data stores and orchestrate inside a VNet, you would have to create ten self-hosted IR infrastructures, adding cost and management overhead for IT teams.

With the new self-hosted IR sharing capability, you can share the same self-hosted IR infrastructure across data factories. This lets you reuse the same highly available and scalable self-hosted IR infrastructure from different data factories within the same Azure Active Directory tenant. We are introducing a new concept, the linked self-hosted IR, which references another self-hosted IR infrastructure. This does not introduce
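To make the linked-IR idea concrete, here is a minimal sketch of what the JSON definition of a linked self-hosted IR might look like, built as a Python dict. The property names (`linkedInfo`, `resourceId`, `authorizationType`) and the RBAC authorization mode are assumptions for illustration, not a verified schema; the placeholder resource ID is hypothetical.

```python
import json

# Sketch of an ARM-style definition for a linked self-hosted IR.
# Rather than installing a new runtime, the linked IR points at the
# shared IR that lives in another data factory in the same AAD tenant.
shared_ir_resource_id = (
    "/subscriptions/<subscriptionId>/resourcegroups/<resourceGroup>/"
    "providers/Microsoft.DataFactory/factories/<sharingFactory>/"
    "integrationruntimes/<sharedIR>"
)

linked_ir = {
    "name": "LinkedSelfHostedIR",
    "properties": {
        "type": "SelfHosted",
        "typeProperties": {
            # Assumed shape: reference the shared IR by resource ID,
            # with access granted via role-based access control.
            "linkedInfo": {
                "resourceId": shared_ir_resource_id,
                "authorizationType": "Rbac",
            }
        },
    },
}

print(json.dumps(linked_ir, indent=2))
```

The key point the sketch captures is that the linking factory stores only a reference to the sharing factory's IR, so one on-premises runtime installation can serve many factories.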


20 Aug

Migrate Windows Server 2008 to Azure with Azure Site Recovery

For close to 10 years now, Windows Server 2008/2008 R2 has been a trusted and preferred server platform for our customers. With millions of instances deployed worldwide, our customers run many of their business applications including their most critical ones on the Windows Server 2008 platform.

With the end of support for Windows Server 2008 in January 2020 fast approaching, now is a great opportunity for customers running Windows Server 2008 to modernize their applications and infrastructure and take advantage of the power of Azure. But we know that digital transformation doesn’t happen overnight. There are some great new offers that customers running their business applications on Windows Server 2008 can benefit from as they get started on their digital transformation journey in Azure. One option is to migrate servers running Windows Server 2008 to Azure and get extended security updates for three years past the end-of-support date, at no additional cost. In other words, if you choose to run your applications on Windows Server 2008 on Azure virtual machines, you get extended security updates for free. Further, with Azure Hybrid Benefit you


26 Jul

Avoid Big Data pitfalls with Azure HDInsight and these partner solutions

According to a Gartner 2017 prediction, “60 percent of big data projects will fail to go beyond piloting and experimentation; these projects will be abandoned.”

Whether you have worked on an analytical project or are starting one, it is a challenge on any cloud. You need to juggle the intricacies of cloud provider services, open source frameworks, and the apps in the ecosystem. Apache Hadoop and Spark are vibrant open source ecosystems that have enabled enterprises to digitally transform their businesses using data. According to Matt Turck, a VC at FirstMark, it has been an exciting but complex year in the data world: “The data tech ecosystem has continued to fire on all cylinders. If nothing else, data is probably even more front and center in 2018, in both business and personal conversations.”

However, with great power comes greater responsibility from the ecosystem. A successful project takes a lot more than just using open source or a managed platform. You have to deal with:

- The complexity of combining all the open source frameworks.
- Architecting a data lake to get insights for data engineers, data scientists, and BI users.
- Meeting enterprise regulations such as security, access control, data sovereignty &


31 May

Receiving and handling HTTP requests anywhere with the Azure Relay

If you followed Microsoft’s coverage from the Build 2018 conference, you may have been as excited as we were about the new Visual Studio Live Share feature that allows instant, remote, peer-to-peer collaboration between Visual Studio users, no matter where they are. One developer could be sitting in a coffee shop and another on a plane with in-flight WiFi, and yet both can collaborate directly on code.

The “networking magic” that enables the Visual Studio team to offer this feature is the Azure Relay, which is a part of the messaging services family along with Azure Service Bus, Azure Event Hubs, and Azure Event Grid. The Relay is, indeed, the oldest of all Azure services, with the earliest public incubation having started exactly 12 years ago today, and it was amongst the handful of original services that launched with the Azure platform in January 2010.

Since then, the Relay has learned to speak a fully documented open protocol that can work with any WebSocket client stack, and allows any such client to become a listener for inbound connections from other clients, without needing inbound firewall rules, public IP addresses, or DNS registrations. Since all inbound communication terminates inside the
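The rendezvous pattern behind this is worth sketching: the listener never accepts an inbound connection; it dials out to the relay once and then serves requests that arrive over that already-open channel, which is why no firewall rule or public IP is needed. Below is an in-process toy model of that pattern, with Python queues standing in for WebSockets; nothing here is the actual Azure Relay protocol or API, and the `Relay` class and message shapes are invented for illustration.

```python
import queue
import threading

class Relay:
    """Toy stand-in for a cloud relay: pairs senders with a listener."""
    def __init__(self):
        # The listener's single outbound channel; requests flow down it.
        self.to_listener = queue.Queue()

    def send(self, request):
        """A sender hands the relay a request plus a private reply channel."""
        reply = queue.Queue()
        self.to_listener.put((request, reply))
        return reply.get(timeout=5)  # wait for the listener's response

def listener_loop(relay, stop):
    # The listener "dials out" to the relay and serves whatever arrives,
    # so it works from behind NAT or a firewall with no inbound ports.
    while not stop.is_set():
        try:
            request, reply = relay.to_listener.get(timeout=0.1)
        except queue.Empty:
            continue
        reply.put(f"handled: {request}")

relay = Relay()
stop = threading.Event()
t = threading.Thread(target=listener_loop, args=(relay, stop))
t.start()
response = relay.send("GET /hello")  # behaves like an inbound request
print(response)
stop.set()
t.join()
```

The design point the toy captures is that both sides make only outbound connections; the relay terminates the "inbound" traffic and forwards it over the listener's existing channel.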


23 May

An update on the integration of Avere Systems into the Azure family

It has been three months since we closed on the acquisition of Avere Systems. Since that time, we’ve been hard at work integrating the Avere and Microsoft families, growing our presence in Pittsburgh and meeting with customers and partners at The National Association of Broadcasters Show.

It’s been exciting to hear how Avere has helped businesses address a broad range of compute and data challenges, helping produce blockbuster movies and life-saving drug therapies faster than ever before with hybrid and public cloud options. I’ve also appreciated the opportunity to address our customers’ questions and concerns, and thought it might be helpful to share the most common ones with the broader Azure/Avere community:

When will Avere be available on Microsoft Azure? We are on track to release Microsoft Avere vFXT to the Azure Marketplace later this year. With this technology, Azure customers will be able to run compute-intensive applications completely on Azure or take advantage of our scale on an as-needed basis.

Will Microsoft continue to support the Avere FXT physical appliance? Yes, we will continue to invest in, upgrade, and support the Microsoft Avere FXT physical appliance, which customers tell us is particularly important for their on-premises and hybrid environments.


24 Apr

The edge of possibility: best practices for IoT-driven infrastructure transformation

Corporate IT infrastructure has changed a lot in the past decade. From a relatively simple bounded space, with a defined “inside” and “outside,” IT networks have evolved to incorporate a wider range of devices, such as smartphones and tablets, and a growing amount of traffic from additional diverse networks, including the public Internet. However, nothing has the potential to disrupt traditional infrastructure topologies more than the Internet of Things (IoT). This has implications for infrastructure and operations (I&O) teams, as well as developers who are responsible for IoT solutions. A recent Forrester report titled “Edge Computing: IoT Will Spawn A New Infrastructure Market” highlights many of the changes and challenges that must be faced in this rapid evolution. Let’s take a look at a few of the highlights.

Consider the full breadth of devices: The “things” that are connected in IoT require new approaches to development and management, but these endpoints are not the only new hardware you have to consider. Diverse components, including field-located IoT gateways and micro-datacenters, will become part of the networked environments. The need for edge infrastructure will depend on how much latency can be tolerated in the system and the complexity of the operations that


29 Mar

Secure your backups, not just your data!

In today’s digital world, where data is the new currency, protecting this data has become more important than ever before. In 2017, attackers had a huge impact on businesses as we saw a large outbreak of ransomware attacks like WannaCry, Petya, and Locky. According to a report from Malwarebytes, ransomware detections were up 90 and 93 percent for businesses and consumers, respectively, in 2017. When a machine is attacked by ransomware, backups are usually the last line of defense that customers resort to.

With increasing innovation in the ransomware space, attackers are no longer restricting themselves to data corruption alone. Backups are becoming the next line of attack for ransomware tools, and when you are left with neither your data nor its backups, you end up hostage to the ransomware attacker. This month saw the advent of a new ransomware, Zenis (found by the MalwareHunterTeam), which not only encrypts your files but also deliberately deletes your backups.

Combating these attacks requires more than just taking a backup. Taking a backup is only the first step in your protection; it is just as important to safeguard those backups.

Ask these questions to check how secure your


08 Feb

First System Center Semi-Annual Channel release now available

I am excited to announce that System Center, version 1801 is now available. Based on customer feedback, we are delivering new features and enhancements in this release including improved Linux monitoring support, more efficient VMware backup, additional support for Windows Server, and improved user experience and performance. 

System Center, version 1801 is the first of our Semi-Annual Channel releases delivering new capabilities at a faster cadence. Semi-Annual Channel releases have an 18-month support policy. In addition, we will continue to release in the Long-Term Servicing Channel (LTSC) at a lower frequency. The LTSC will continue to provide 5 years of mainstream support followed by 5 more years of extended support.

What’s in System Center, version 1801?

System Center, version 1801 focuses on enhancements and features for System Center Operations Manager, Virtual Machine Manager, and Data Protection Manager. Additionally, security and bug fixes, as well as support for TLS 1.2, are available for all System Center components including Orchestrator, Service Management Automation, and Service Manager.

I am pleased to share the capabilities included in this release:

Support for additional Windows Server features in Virtual Machine Manager: Customers can now set up nested virtualization, software load balancer configuration, and storage QoS configuration and
