12 Nov

Five essential skills for today’s database administrators

DevOps, the cloud, and new database technologies mean our jobs as database administrators (DBAs) are changing at an ever-faster pace. If you’re fascinated by data and all the things you can do with it, it’s a thrilling time to be in the business. Here are five of the skills we see as essential parts of the modern DBA’s toolkit.

Expertise with multiple technologies: The one-size-fits-all approach to databases is fading. Just as application developers are moving toward a microservices model that focuses on the right tool for the job, organizations are choosing databases according to specific workload needs. The more you know about Hadoop, NoSQL, graph, and other technologies, the better-positioned you will be to make a positive contribution to the conversation.Collaboration: Speaking of conversations, DBAs will increasingly need to become contributing members of application teams rather than siloed specialists off in their own corners. DevOps tend to break down the barriers between IT functions. Understanding how applications work, and, even better, how they deliver business value, puts you in a position to be a creative problem solver and all-around data expert.Data science skills: Machine learning and AI are some of the fastest growing uses of data in the enterprise today.


12 Nov

Updated Ribbon Experience for Power BI Desktop

Source: https://powerbi.microsoft.com/en-us/blog/updated-ribbon-experience-for-power-bi-desktop/

The ribbon in Power BI Desktop has an updated and modern experience. When you turn on the “Updated ribbon” preview feature, you will see a ribbon that looks more like the ribbon in Office.


12 Nov

FedRAMP Moderate Blueprints helps automate US federal agency compliance

We’ve just released our newest Azure Blueprints for the important US Federal Risk and Authorization Management Program (FedRAMP) certification at the moderate level. FedRAMP is a key certification because cloud providers seeking to sell services to US federal government agencies must first demonstrate FedRAMP compliance. Azure and Azure Government are both approved for FedRAMP at the high impact level, and we plan for a future Azure Blueprint to provide control mappings for high impact as well.

Azure Blueprints is a free service that enables customers to define a repeatable set of Azure resources that implement and adhere to standards, patterns, and requirements. It allows customers to set up compliant environments matched to common internal scenarios and external standards such as ISO 27001, the Payment Card Industry Data Security Standard (PCI DSS), and Center for Internet Security (CIS) Benchmarks.
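As a rough illustration (not taken from this announcement), assigning a published blueprint to a subscription comes down to a small assignment document sent to the Blueprints API. The blueprint name, version, and location below are hypothetical placeholders:

```json
{
  "identity": { "type": "SystemAssigned" },
  "location": "eastus",
  "properties": {
    "blueprintId": "/providers/Microsoft.Blueprint/blueprints/fedramp-moderate/versions/v1",
    "parameters": {},
    "resourceGroups": {}
  }
}
```

Once assigned, Azure tracks the deployed resources against the blueprint definition, which is what makes the control mappings repeatable and auditable.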

Compliance with standards such as FedRAMP is increasingly important for all types of organizations, making control mappings to compliance standards a natural application for Azure Blueprints. Azure customers, particularly those in regulated industries, have expressed a strong interest in compliance blueprints to help ease the burden of their compliance obligations.

FedRAMP was established to provide a standardized approach for assessing, monitoring, and authorizing cloud


11 Nov

Exclusive LIVE Community Event #2 – Ask Amir Anything

Source: https://powerbi.microsoft.com/en-us/blog/exclusive-live-community-event-2-ask-amir-anything/

Next in the Power BI community Triple A series: Ask Amir Netz questions about the latest updates, features, and the future.



11 Nov

Build an intelligent analytics platform with SQL Server 2019 Big Data Clusters

In its most recent releases, SQL Server went beyond relational data, adding support for graph data and for R and Python machine learning, while becoming available on Linux and in containers in addition to Windows. At the same time, organizations are challenged by the amount of data stored in different formats and silos, and by the expertise required to extract value from that data. Through enhancements in data virtualization and platform management, Microsoft SQL Server 2019 Big Data Clusters provides an innovative and integrated solution to overcome these difficulties. It incorporates Apache Spark and HDFS in addition to SQL Server, on a platform built entirely from containerized applications and designed to derive new intelligent insights from data.

Modernize your data estate with a scalable data virtualization and analytics platform

Data integration strategies based on extract, transform, and load (ETL) result in data duplication and in transformations that diminish data quality, increase maintenance, and raise security risks. SQL Server 2019 takes a new approach to data integration, called data virtualization, which queries disparate and diverse data sources without moving the data. Out-of-the-box connectors for data sources such as Oracle, Teradata, and MongoDB help you keep the data in place and secure, with lower maintenance and storage costs.
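As a sketch of what data virtualization looks like in practice, the T-SQL below defines an external table over an Oracle source. The server address, credential, and table names are hypothetical illustrations, not from the article:

```sql
-- Credential, server, and table names below are hypothetical.
CREATE DATABASE SCOPED CREDENTIAL OracleCredential
    WITH IDENTITY = 'oracle_user', SECRET = '<password>';

CREATE EXTERNAL DATA SOURCE OracleSales
    WITH (LOCATION = 'oracle://oracleserver.contoso.com:1521',
          CREDENTIAL = OracleCredential);

-- The external table can now be queried and joined with local
-- SQL Server tables without copying the Oracle data.
CREATE EXTERNAL TABLE dbo.Orders (
    OrderID INT,
    CustomerID INT,
    Amount DECIMAL(18, 2)
)
WITH (LOCATION = 'SALESDB.SALES.ORDERS', DATA_SOURCE = OracleSales);

SELECT TOP 10 * FROM dbo.Orders;
```

The point of the pattern is that queries against dbo.Orders are pushed to the remote source at query time, so there is no ETL pipeline to build or duplicate copy to secure.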


11 Nov

Announcing the general availability of the new Azure HPC Cache service

If data-access challenges have been keeping you from running high-performance computing (HPC) jobs in Azure, we’ve got great news to report! The now-available Microsoft Azure HPC Cache service lets you run your most demanding workloads in Azure without the time and cost of rewriting applications and while storing data where you want to—in Azure or on your on-premises storage. By minimizing latency between compute and storage, the HPC Cache service seamlessly delivers the high-speed data access required to run your HPC applications in Azure.

Use Azure to expand analytic capacity—without worrying about data access

Most HPC teams recognize the potential for cloud bursting to expand analytic capacity. While many organizations would benefit from the capacity and scale advantages of running compute jobs in the cloud, users have been held back by the size of their datasets and the complexity of providing access to those datasets, typically stored on long-deployed network-attached storage (NAS) assets. These NAS environments often hold petabytes of data collected over a long period of time and represent significant infrastructure investment.

Here’s where the HPC Cache service can help. Think of the service as an edge cache that provides low-latency access to POSIX file data sourced from one or more NAS environments.
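For context, accessing such a cache from a compute node typically looks like an ordinary NFS mount. The cache mount address and namespace path below are hypothetical examples, not values from the announcement:

```shell
# Mount the cache's namespace on a compute node (address and path
# are placeholders; real values come from your cache deployment).
sudo mkdir -p /mnt/hpccache
sudo mount -t nfs -o hard,proto=tcp,mountproto=tcp,retry=30 \
    10.0.0.10:/nfs1/data /mnt/hpccache
```

Because the cache presents standard NFS, HPC applications see a normal POSIX file system and need no code changes to run against it.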


11 Nov

Democratizing agriculture intelligence: Introducing Azure FarmBeats

Source: https://azure.microsoft.com/blog/democratizing-agriculture-intelligence-introducing-azure-farmbeats/

For an industry that started 12,000 years ago, there is a lot of unpredictability and imprecision in agriculture. To be predictable and precise, we need to align our actions with insights gathered from data. Last week at Microsoft Ignite, we introduced Azure FarmBeats.


10 Nov

Power BI Desktop November 2019 Feature Summary

Source: https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-november-2019-feature-summary/

The November update has major updates in several areas of Power BI Desktop. There’s a new, modern ribbon that aligns Power BI Desktop with Office and adds more functionality. We’re also adding a …
