According to a 2017 Gartner prediction, “60 percent of big data projects will fail to go beyond piloting and experimentation and will be abandoned.”
Whether you have worked on an analytical project before or are just starting one, it is a challenge on any cloud: you need to juggle the intricacies of cloud provider services, open source frameworks, and the apps in the ecosystem. Apache Hadoop and Apache Spark anchor vibrant open source ecosystems that have enabled enterprises to digitally transform their businesses using data. According to Matt Turck, VC at FirstMark, it has been an exciting but complex year in the data world: “The data tech ecosystem has continued to fire on all cylinders. If nothing else, data is probably even more front and center in 2018, in both business and personal conversations.”
However, with this power comes greater responsibility. A successful project takes a lot more than simply adopting open source or a managed platform. You have to deal with:
- The complexity of combining all the open source frameworks.
- Architecting a data lake to get insights for data engineers, data scientists, and BI users.
- Meeting enterprise regulations such as security, access control, data sovereignty &
If you followed Microsoft’s coverage from the Build 2018 conference, you may have been as excited as we were about the new Visual Studio Live Share feature that allows instant, remote, peer-to-peer collaboration between Visual Studio users, no matter where they are. One developer could be sitting in a coffee shop and another on a plane with in-flight WiFi, and yet both can collaborate directly on code.
The “networking magic” that enables the Visual Studio team to offer this feature is the Azure Relay, which is a part of the messaging services family along with Azure Service Bus, Azure Event Hubs, and Azure Event Grid. The Relay is, indeed, the oldest of all Azure services, with the earliest public incubation having started exactly 12 years ago today, and it was amongst the handful of original services that launched with the Azure platform in January 2010.
In the meantime, the Relay has learned to speak a fully documented open protocol that works with any WebSocket client stack and allows any such client to become a listener for inbound connections from other clients, without needing inbound firewall rules, public IP addresses, or DNS registrations. Since all inbound communication terminates inside the
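To make the listener model above concrete, here is a minimal Python sketch of the two documented building blocks of the Hybrid Connections WebSocket protocol: the Shared Access Signature token and the rendezvous URL a listener dials *outbound* (the namespace, connection name, and key below are placeholders, not real credentials).

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    """Build a Service Bus SharedAccessSignature token:
    an HMAC-SHA256 over the URL-encoded resource URI and expiry."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest()
    ).decode("utf-8")
    return (
        f"SharedAccessSignature sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature)}"
        f"&se={expiry}&skn={key_name}"
    )

def listen_url(namespace, hybrid_connection, token):
    """The wss:// address a listener connects to. Because the listener
    dials out to the service, no inbound firewall rule, public IP,
    or DNS entry is needed on the listener's side."""
    return (
        f"wss://{namespace}.servicebus.windows.net/$hc/{hybrid_connection}"
        f"?sb-hc-action=listen&sb-hc-token={urllib.parse.quote_plus(token)}"
    )

# Placeholder namespace/connection/key for illustration only.
token = sas_token("http://contoso.servicebus.windows.net/myhc",
                  "listenRule", "secret-key")
print(listen_url("contoso", "myhc", token))
```

An actual listener would open a WebSocket to that URL (with any standard WebSocket stack) and then accept forwarded connections over it; sender clients use the same URL shape with `sb-hc-action=connect`.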
It has been three months since we closed on the acquisition of Avere Systems. Since that time, we’ve been hard at work integrating the Avere and Microsoft families, growing our presence in Pittsburgh and meeting with customers and partners at The National Association of Broadcasters Show.
It’s been exciting to hear how Avere has helped businesses address a broad range of compute and data challenges, helping produce blockbuster movies and life-saving drug therapies faster than ever before with hybrid and public cloud options. I’ve also appreciated the opportunity to address our customers’ questions and concerns, and thought it might be helpful to share the most common ones with the broader Azure/Avere community:
When will Avere be available on Microsoft Azure?

We are on track to release Microsoft Avere vFXT to the Azure Marketplace later this year. With this technology, Azure customers will be able to run compute-intensive applications completely on Azure, or take advantage of our scale on an as-needed basis.

Will Microsoft continue to support the Avere FXT physical appliance?

Yes, we will continue to invest in, upgrade, and support the Microsoft Avere FXT physical appliance, which customers tell us is particularly important for their on-premises and hybrid environments.
Corporate IT infrastructure has changed a lot in the past decade. From a relatively simple bounded space, with a defined “inside” and “outside,” IT networks have evolved to incorporate a wider range of devices, such as smartphones and tablets, and a growing amount of traffic from additional diverse networks, including the public Internet. However, nothing has the potential to disrupt traditional infrastructure topologies more than the Internet of Things (IoT). This has implications for infrastructure and operations (I&O) teams, as well as developers who are responsible for IoT solutions. A recent Forrester report titled “Edge Computing: IoT Will Spawn A New Infrastructure Market” highlights many of the changes and challenges that must be faced in this rapid evolution. Let’s take a look at a few of the highlights.
Consider the full breadth of devices: The “things” that are connected in IoT require new approaches to development and management, but these endpoints are not the only new hardware you have to consider. Diverse components, including field-located IoT gateways and micro-datacenters, will become part of the networked environments. The need for edge infrastructure will depend on how much latency can be tolerated in the system and the complexity of the operations that
In today’s digital world where data is the new currency, protecting this data has become more important than ever before. In 2017, attackers had a huge impact on businesses as we saw a large outbreak of ransomware attacks like WannaCry, Petya and Locky. According to a report from MalwareBytes, ransomware detections were up 90 and 93 percent for businesses and consumers respectively in 2017. When a machine gets attacked by ransomware, backups are usually the last line of defense that customers resort to.
With increasing innovation in the ransomware space, attackers are no longer restricting themselves to data corruption alone. Backups are becoming the next line of attack for ransomware tools, and when you are left with neither your data nor its backups, you end up hostage to the attacker. This month saw the advent of a new ransomware, Zenis (found by the MalwareHunterTeam), which not only encrypts your files but also deliberately deletes your backups.
Combating these attacks requires more than just taking a backup. A backup is only the first step in your protection; it is equally important to safeguard those backups in this threat model.
Ask these questions to check how secure your
I am excited to announce that System Center, version 1801 is now available. Based on customer feedback, we are delivering new features and enhancements in this release including improved Linux monitoring support, more efficient VMware backup, additional support for Windows Server, and improved user experience and performance.
System Center, version 1801 is the first of our Semi-Annual Channel releases delivering new capabilities at a faster cadence. Semi-Annual Channel releases have an 18-month support policy. In addition, we will continue to release in the Long-Term Servicing Channel (LTSC) at a lower frequency. The LTSC will continue to provide 5 years of mainstream support followed by 5 more years of extended support.
What’s in System Center, version 1801?
System Center, version 1801 focuses on enhancements and features for System Center Operations Manager, Virtual Machine Manager, and Data Protection Manager. Additionally, security and bug fixes, as well as support for TLS 1.2, are available for all System Center components including Orchestrator, Service Management Automation, and Service Manager.
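As a small aside on what TLS 1.2 support means in practice for anything talking to these components, here is a minimal Python sketch (not System Center's own configuration, just an illustration) of a client that refuses protocol versions older than TLS 1.2:

```python
import ssl

# Create a client-side context and raise its protocol floor so that
# handshakes with TLS 1.0/1.1-only endpoints fail instead of
# silently negotiating a weaker protocol.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version)
```

Servers that have disabled older protocols, as System Center components can with TLS 1.2 support in place, will interoperate cleanly with clients configured this way.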
I am pleased to share the capabilities included in this release:
A few months ago, we announced that we were performing a compliance assessment of Microsoft Azure Stack. Today, we are happy to share that the assessment is complete and available to you.
Knowing that preparing compliance paperwork is a tedious task, we precompiled the documentation for our customers. Since Azure Stack is delivered as an integrated system through hardware partners, we are in a unique position to perform a formal compliance assessment of Azure Stack that applies to all our customers. This resulted in a set of precompiled compliance documents that customers can now use to accelerate their compliance certification process.
We are glad to announce that Coalfire, a Qualified Security Assessor (QSA) and independent auditing firm, has audited and evaluated Azure Stack Infrastructure against the technical controls of PCI-DSS and the CSA Cloud Control Matrix, and found that Azure Stack satisfies the applicable controls.
In the assessor’s words:
“It is Coalfire’s opinion that Microsoft Azure Stack integrated system, reviewed between July 2017 and October 2017, can be effective in creating a PCI DSS compliant infrastructure and to assist in a comprehensive program of compliance with PCI DSS version 3.2.”
“It is Coalfire’s opinion that Microsoft Azure Stack as deployed