Category Archives: Announcements

05 Dec

Microsoft has validated the Lenovo ThinkSystem SE350 edge server for Azure Stack HCI

Do you need rugged, compact hyperconverged infrastructure (HCI) servers to run your branch office and edge workloads? Do you want to modernize your applications and IoT functions with container technology? Do you want to leverage Azure’s hybrid services such as backup, disaster recovery, update management, monitoring, and security compliance?

Microsoft and Lenovo have teamed up to validate the Lenovo ThinkSystem SE350 for Microsoft’s Azure Stack HCI program. The ThinkSystem SE350 was designed and built with the unique requirements of edge servers in mind. It is versatile enough to stretch the limits of server placement, provides a variety of connectivity and security options, and can be easily managed with Lenovo XClarity Controller. The solution focuses on smart connectivity, business security, and manageability in harsh environments. To see all Lenovo servers validated for Azure Stack HCI, visit the Azure Stack HCI catalog.

Lenovo ThinkSystem SE350:

The ThinkSystem SE350 is the latest workhorse for the edge.


03 Dec

Extended filesystem programming capabilities in Azure Data Lake Storage

Since the general availability of Azure Data Lake Storage Gen2 in February 2019, customers have been getting insights at cloud scale faster than ever before. Integration with analytics engines is critical for their analytics workloads, and equally important is the ability to programmatically ingest, manage, and analyze data. This ability is critical for key areas of enterprise data lakes such as data ingestion, event-driven big data platforms, machine learning, and advanced analytics. Programmatic access is possible today using Azure Data Lake Storage Gen2 REST APIs or Blob REST APIs. In addition, customers can enable continuous integration and continuous delivery (CI/CD) pipelines using Blob PowerShell and CLI capabilities via multi-protocol access. As part of the journey to enable our developer ecosystem, our goal is to make customer application development easier than ever before.

We are excited to announce the public preview of the .NET SDK, Python SDK, Java SDK, PowerShell, and CLI for filesystem operations on Azure Data Lake Storage Gen2. Customers who are used to the familiar filesystem programming model can now implement this model using the .NET, Python, and Java SDKs. Customers can also now incorporate these filesystem operations into their CI/CD pipelines using PowerShell and the CLI, thereby enriching those pipelines.
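As an illustrative sketch of the filesystem programming model with the Python SDK (the account name, key, filesystem, and paths below are placeholders, and the preview API surface may evolve):

```python
# Minimal sketch using the (preview) azure-storage-file-datalake package.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential="<account-key>",
)

# Create a filesystem, then use familiar filesystem-style operations:
# make a directory, write a file, and read it back.
filesystem = service.create_file_system("myfilesystem")
directory = filesystem.create_directory("raw/2019/12")
file_client = directory.create_file("events.json")

data = b'{"event": "hello"}'
file_client.append_data(data, offset=0, length=len(data))
file_client.flush_data(len(data))

print(file_client.download_file().readall())
```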


02 Dec

Application Gateway Ingress Controller for Azure Kubernetes Service

https://azure.microsoft.com/blog/application-gateway-ingress-controller-for-azure-kubernetes-service/


26 Nov

Multi-protocol access on Data Lake Storage now generally available

We are excited to announce the general availability of multi-protocol access for Azure Data Lake Storage. Azure Data Lake Storage is a unique cloud storage solution for analytics that offers multi-protocol access to the same data. This is a no-compromise solution that allows both the Azure Blob Storage API and the Azure Data Lake Storage API to access data on a single storage account. You can store all your different types of data in one place, which gives you the flexibility to make the best use of your data as your use case evolves. The general availability of multi-protocol access creates the foundation to enable object storage capabilities on Data Lake Storage. This brings together the best of both object storage and Hadoop Distributed File System (HDFS) to enable scenarios that, until today, were not possible without copying data.
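As an illustrative sketch (the account name, key, container, and paths below are placeholders), the same object can be written through the Blob API and read back through the Data Lake Storage API on a single account:

```python
# Sketch: one storage account, two protocols over the same data.
from azure.storage.blob import BlobServiceClient
from azure.storage.filedatalake import DataLakeServiceClient

KEY = "<account-key>"

# Write through the Blob endpoint.
blob_service = BlobServiceClient(
    account_url="https://myaccount.blob.core.windows.net", credential=KEY)
blob_service.create_container("telemetry")
blob = blob_service.get_blob_client(container="telemetry", blob="day1/readings.csv")
blob.upload_blob(b"sensor,value\na,1\n", overwrite=True)

# Read the same object through the hierarchical (DFS) endpoint.
dfs_service = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net", credential=KEY)
reading = (
    dfs_service.get_file_system_client("telemetry")
    .get_file_client("day1/readings.csv")
    .download_file()
    .readall()
)
print(reading)
```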

Broader ecosystem of applications and features

Multi-protocol access provides a powerful foundation for enabling integrations and features for Data Lake Storage. Existing object storage applications and connectors can now be used to access data stored in Data Lake Storage with no changes. This vastly accelerates the integration of Azure services and the partner ecosystem with Data Lake Storage.


18 Nov

Introducing Azure Cost Management for partners

https://azure.microsoft.com/blog/introducing-azure-cost-management-for-partners/

As a partner, you play a critical role in successfully planning and managing long-term cloud implementations for your customers. While the cloud grants the flexibility to scale infrastructure to changing needs, controlling costs can become challenging.


13 Nov

GitHub Actions for Azure is now generally available

GitHub Actions make it possible to create simple yet powerful workflows to automate software compilation and delivery integrated with GitHub. These actions, defined in YAML files, allow you to trigger an automated workflow process on any GitHub event, such as code commits, creation of Pull Requests or new GitHub Releases, and more.

As GitHub just announced the general availability of their Actions feature today, we’re announcing that GitHub Actions for Azure are now generally available.

You can find all the GitHub Actions for Azure and their repositories listed on GitHub with documentation and sample templates to help you easily create workflows to build, test, package, release and deploy to Azure, following a push or pull request.

You can also use Azure starter templates to easily create GitHub CI/CD workflows targeting Azure to deploy your apps created with popular languages and frameworks including .NET, Node.js, Java, PHP, Ruby, or Python, in containers or running on any operating system.

Connect to Azure

Authenticate your Azure subscription using the Azure login (azure/login) action and a service principal. You can then run Azure CLI scripts to create and manage any Azure resource using the Azure CLI (azure/cli) action.
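For example, a minimal workflow sketch combining the two actions (the secret name, resource group, and location below are placeholders, not prescribed by the actions) could look like this:

```yaml
# Sketch: log in with a service principal stored in the
# AZURE_CREDENTIALS repository secret, then run an Azure CLI script.
name: azure-cli-sample

on: [push]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1

      # Authenticate the workflow run against the Azure subscription.
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      # Any Azure CLI script can now manage resources in that subscription.
      - uses: azure/cli@v1
        with:
          inlineScript: az group create --name sample-rg --location eastus
```

The azure/login step authenticates the run; each subsequent azure/cli step then operates against that subscription.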


13 Nov

Azure Container Registry: preview of repository-scoped permissions

https://azure.microsoft.com/blog/azure-container-registry-preview-of-repository-scoped-permissions/

The Azure Container Registry (ACR) team is rolling out the preview of repository-scoped role-based access control (RBAC) permissions, our top-voted item on UserVoice. In this release, we have a command-line interface (CLI) experience for you to try and provide feedback on.
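As a rough sketch of that CLI experience (the registry, token, and repository names below are placeholders, and preview commands and flags may change):

```bash
# Sketch: create a token scoped to a single repository with
# pull and push actions, then generate credentials for docker login.
az acr token create \
  --name hello-world-token \
  --registry myregistry \
  --repository samples/hello-world content/read content/write

az acr token credential generate \
  --name hello-world-token \
  --registry myregistry
```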


11 Nov

Announcing the general availability of the new Azure HPC Cache service

If data-access challenges have been keeping you from running high-performance computing (HPC) jobs in Azure, we’ve got great news to report! The now-available Microsoft Azure HPC Cache service lets you run your most demanding workloads in Azure without the time and cost of rewriting applications and while storing data where you want to—in Azure or on your on-premises storage. By minimizing latency between compute and storage, the HPC Cache service seamlessly delivers the high-speed data access required to run your HPC applications in Azure.

Use Azure to expand analytic capacity—without worrying about data access

Most HPC teams recognize the potential for cloud bursting to expand analytic capacity. While many organizations would benefit from the capacity and scale advantages of running compute jobs in the cloud, users have been held back by the size of their datasets and the complexity of providing access to those datasets, typically stored on long-deployed network-attached storage (NAS) assets. These NAS environments often hold petabytes of data collected over a long period of time and represent significant infrastructure investment.

Here’s where the HPC Cache service can help. Think of the service as an edge cache that provides low-latency access to POSIX file data sourced from one or more back-end storage systems, in Azure or on-premises.
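As an illustrative sketch only (the cache address, export path, and mount options below are assumptions, not details from this announcement), a Linux compute node might mount the cache over NFS along these lines:

```bash
# Sketch: mount an HPC Cache namespace path on a compute client,
# then run jobs against the low-latency cached view of the data.
sudo mkdir -p /mnt/hpccache
sudo mount -o hard,proto=tcp,mountproto=tcp,retry=30 \
  10.0.0.10:/namespace-path /mnt/hpccache
```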


11 Nov

Democratizing agriculture intelligence: introducing Azure FarmBeats

https://azure.microsoft.com/blog/democratizing-agriculture-intelligence-introducing-azure-farmbeats/

For an industry that started 12,000 years ago, agriculture still involves a great deal of unpredictability and imprecision. To be predictable and precise, we need to align our actions with insights gathered from data. Last week at Microsoft Ignite, we introduced Azure FarmBeats.


07 Nov

Sharing the DevOps journey at Microsoft

Today, more and more organizations are focused on delivering new digital solutions to customers and finding that the need for increased agility, improved processes, and collaboration between development and operation teams is becoming business-critical. For over a decade, DevOps has been the answer to these challenges. Understanding the need for DevOps is one thing, but the actual adoption of DevOps in the real world is a whole other challenge. How can an organization with multiple teams and projects, with deeply rooted existing processes, and with considerable legacy software change its ways and embrace DevOps?

At Microsoft, we know something about these challenges. As a company that has been building software for decades, Microsoft comprises thousands of engineers around the world who deliver many different products. From Office to Azure to Xbox, we too have needed to adapt to a new way of delivering software. The new era of the cloud unlocks tremendous potential for innovation to meet our customers’ growing demand for richer and better experiences, while our competition is not slowing down. The need to accelerate innovation and to transform how we work is real and urgent.

The road to transformation is not easy.
