Because of existing and upcoming regulations, insurers perform extensive analysis of their assets and liabilities. Actuaries need time to review and correct results before presenting reports to regulators. Today, it is common for quarterly reporting to require thousands of hours of compute time. Companies that offer variable annuity products must follow Actuarial Guideline XLIII, which requires several compute-intensive tasks, including nested stochastic modeling. Solvency II demands substantial computational analysis to determine the Solvency Capital Requirement and the Minimum Capital Requirement. International Financial Reporting Standard 17 (IFRS 17) requires analysis of each policy, reviews of overall profitability, and more. Actuarial departments everywhere work to make sure that their financial and other models produce results that can be used to evaluate their business for regulatory and internal needs.
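To make "nested stochastic modeling" concrete, here is a minimal Python sketch of the pattern: an outer loop of real-world scenarios, each of which triggers its own inner Monte Carlo valuation. The fund dynamics, guarantee level, and path counts are invented for illustration and are orders of magnitude smaller than a production run.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner_valuation(start_value, n_paths=100, n_steps=12, drift=0.02, vol=0.15):
    """Inner loop: Monte Carlo valuation of a liability from one outer state."""
    dt = 1.0 / n_steps
    # Simulate n_paths lognormal fund paths over one year.
    shocks = rng.normal((drift - 0.5 * vol**2) * dt, vol * np.sqrt(dt),
                        size=(n_paths, n_steps))
    terminal = start_value * np.exp(shocks.sum(axis=1))
    guarantee = 100.0  # hypothetical guaranteed benefit
    # Liability estimate: average shortfall below the guarantee.
    return np.maximum(guarantee - terminal, 0.0).mean()

def nested_valuation(n_outer=50):
    """Outer loop: real-world scenarios; each state triggers an inner valuation."""
    outer_states = 100.0 * np.exp(rng.normal(0.0, 0.2, size=n_outer))
    return np.array([inner_valuation(s) for s in outer_states])

liabilities = nested_valuation()
```

Because every outer scenario multiplies into hundreds of inner paths, the total path count grows as outer × inner, which is exactly why these runs consume thousands of hours of compute time.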
With all this reporting, actuaries get pinched for time. They need time for things like:

- Development: Actuaries code the models in their favorite software or in custom solutions. Anything they can do to shorten the code-test-review cycle helps deliver the actuarial results sooner.
- Data preparation: Much of the source data is initially entered by hand. Errors need to be identified and fixed. If the errors can
No insurance company should keep on-premises compute grids for actuarial computing. In the past, resistance to the cloud went along these lines: the cloud lacks data security, the cloud is expensive, and no one has experience with the cloud. But those arguments are out of date. I have worked in, and supported, compute grids at many different insurance companies. Just before joining Microsoft, I led a project to move workloads to Azure and to decommission on-premises grids globally. At this point, all insurance companies see increasing demand from growth in the number of policies processed and from new regulations that require changes to the actuarial and accounting systems. IFRS 17 requires changes to workflows, reporting, and controls throughout the actuarial and accounting process. Now is the time to move to a cloud-based solution on Azure.
Why wait to move to a cloud-based compute solution?
Over the years, I’ve worked in IT departments supporting actuaries, and in an actuarial department working with IT teams. I have seen three main blockers when moving to an all-cloud solution. It always starts with the Business Information Security Officer (BISO), who has security and business continuity questions. Then the accounting, legal and
How to choose the right encryption technology for Azure SQL Database or SQL Server
Transparent Data Encryption (TDE) and Always Encrypted are two different encryption technologies offered by SQL Server and Azure SQL Database. Each protects data from unauthorized access, but in different scenarios. They are complementary features, and this blog post shows a side-by-side comparison to help you decide which technology to choose and how to combine them for a layered security approach.
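One practical difference shows up in the client: TDE is enabled on the server and is invisible to applications, while Always Encrypted must be switched on in the client driver. The sketch below illustrates this with a hypothetical helper function; `ColumnEncryption=Enabled` is the real keyword understood by the Microsoft ODBC Driver for SQL Server, but everything else here is an illustrative assumption.

```python
def connection_string(server, database, always_encrypted=False):
    """Build an ODBC connection string (illustrative helper, not an official API).

    TDE is configured on the server/database itself and is transparent to
    clients, so nothing changes here when TDE is on. Always Encrypted is
    enforced client-side, so the driver must be told to handle protected
    columns.
    """
    parts = [
        "Driver={ODBC Driver 18 for SQL Server}",
        f"Server={server}",
        f"Database={database}",
        "Encrypt=yes",  # TLS for data in transit (separate from TDE/AE)
    ]
    if always_encrypted:
        # Keyword recognized by the Microsoft ODBC driver for Always Encrypted.
        parts.append("ColumnEncryption=Enabled")
    return ";".join(parts)

cs_ae = connection_string("myserver.database.windows.net", "Sales",
                          always_encrypted=True)
cs_plain = connection_string("myserver.database.windows.net", "Sales")
```

The asymmetry is the point: turning TDE on or off never touches application code, whereas Always Encrypted requires every client that reads protected columns to opt in.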
Transparent Data Encryption
TDE is intended to add a layer of security that protects data at rest from offline access to raw files or backups. Common scenarios include datacenter theft or the unsecured disposal of hardware or media such as disk drives and backup tapes. For a deeper look at how TDE protects against malicious parties trying to recover stolen database files (data and log files, snapshots, copies, or backups), and to review TDE best practices, see Feature Spotlight: Transparent Data Encryption (TDE).
Enabling TDE on databases helps you comply with many laws, regulations, and security guidelines established across various industries that require data to be encrypted at rest. Unless data stored in a SQL database has no protection requirements at all,
I follow several publications covering trends and emerging innovations in retail and consumer goods. Artificial Intelligence (AI) continues to be touted as a key ingredient in transforming this industry. I agree with that sentiment: cloud computing and data availability are the critical components, and combined they create a strong case for modernization.
We are seeing real applications of AI deliver positive business improvements, solving problems that range from customer service to production. These examples are tangible and demonstrate the merits of AI and its applicability in retail and consumer goods. Take Macy’s virtual agent, which can solve customer issues via the web and transfer customers seamlessly to a live agent if necessary. More than one-quarter of customer queries are answered by the virtual agent, improving the speed of service for customers and providing valuable data that is connected to back-end systems through the Microsoft Dynamics 365 AI solution for customer service.
Deschutes Brewery is another great example. It’s the seventh-largest craft brewery in the United States. By pairing the OSIsoft PI System, which collects and manages production data, with the Microsoft Cortana Intelligence Suite, the brewery estimates a 20 percent increase in production capacity on existing equipment by implementing
We are excited to announce that the August release of SQL Operations Studio is now available.
SQL Operations Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux. To learn more, visit our GitHub.
SQL Operations Studio was announced for Public Preview on November 15th at Connect(), and this August release is the ninth major update since the announcement. If you missed it, the July release announcement is available here.
Highlights for this release include the following.
- Announcing the SQL Server Import extension
- SQL Server Profiler Session management
- New community extension: First responder kit
- Quality of Life improvements: Connection strings
- Bug bash galore
For complete updates, refer to the Release Notes.
Announcing the SQL Server Import extension
It all started from a simple idea: take the #1 most used wizard in SSMS in the past year and bring it to SQL Operations Studio. When we first released our Wizard and Dialog extensibility APIs in June, this was the perfect candidate to test our wizards and highlight to the community that
This post is authored by Tara Shankar Jana, Senior Technical Product Marketing Manager at Microsoft.
The user interface design process involves lots of creativity and iteration. The process often starts with drawings on a whiteboard or a blank sheet of paper, with designers and engineers sharing ideas and trying their best to represent the underlying customer scenario or workflow. Once a candidate design is agreed on, it’s usually captured via a photograph and then translated manually into a working HTML wireframe that runs in a web browser. That translation takes time and effort, and it often slows down the design process.
What if the design could instead be captured from a whiteboard and instantly reflected in a browser? If we could do that, at the end of a design brainstorm session we would have a ready-made prototype that’s already been validated by the designer, the developer, and perhaps even the customer.
Introducing Sketch2Code – a web-based solution that uses AI to transform a picture of a hand-drawn user interface into working HTML code.
Let’s take a closer look at the process of transforming hand-drawn images into HTML using Sketch2Code:
The user first uploads an image using our
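While the actual service combines object detection and handwriting recognition, the final step, turning detected elements into markup, can be sketched in a few lines of Python. The element labels and the dictionary-based schema below are illustrative assumptions, not Sketch2Code's real data model.

```python
# Sketch of the last step in a Sketch2Code-style pipeline: turning detected
# elements (label + text + vertical position) into HTML, top to bottom.

TAGS = {
    "heading": lambda text: f"<h1>{text}</h1>",
    "paragraph": lambda text: f"<p>{text}</p>",
    "button": lambda text: f"<button>{text}</button>",
    "textbox": lambda text: f'<input type="text" placeholder="{text}">',
    "image": lambda text: f'<img alt="{text}">',
}

def elements_to_html(elements):
    """elements: list of dicts with 'label', 'text', and 'top' (y coordinate)."""
    body = "\n".join(
        "  " + TAGS[e["label"]](e["text"])
        # Sort by vertical position so the markup preserves the sketch layout.
        for e in sorted(elements, key=lambda e: e["top"])
    )
    return f"<body>\n{body}\n</body>"

# Hypothetical detections from a whiteboard photo (out of reading order).
detected = [
    {"label": "button", "text": "Submit", "top": 300},
    {"label": "heading", "text": "Sign up", "top": 20},
    {"label": "textbox", "text": "Email", "top": 150},
]
html = elements_to_html(detected)
```

Even this toy version shows why the approach saves time: once elements and their positions are recognized, generating the wireframe is mechanical rather than a manual translation step.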
Today more than ever, privacy is of critical importance in the technology industry. Microsoft has an enduring commitment to protect data privacy, not as an afterthought, but built into Microsoft Azure from the ground up. Microsoft designed Azure with industry-leading security controls, compliance tools, and privacy policies to safeguard your data in the cloud, including the categories of personal data identified by the GDPR. These also help you comply with other important global and regional privacy standards such as ISO/IEC 27018, EU-U.S. Privacy Shield, EU Model Clauses, HIPAA/HITECH, and HITRUST.
When you build on Azure’s secure foundation, you accelerate your move to the cloud by achieving compliance more readily, allowing you to enable privacy-sensitive cloud scenarios, such as financial and health services, with confidence.
In this episode, we describe key tools in Azure that help you achieve your privacy goals, including:
The Azure Data Subject Requests for the GDPR portal, which provides step-by-step guidance on how to comply with GDPR requirements to find and act on personal data that resides in Azure. This capability to execute data subject requests is available through the Azure portal on our public and sovereign clouds, as well as through pre-existing APIs and
We are excited to share that Azure Backup now lets you monitor all workloads it protects by leveraging the power of Log Analytics (LA). Enterprises can monitor key backup parameters across Recovery Services vaults and subscriptions, irrespective of which Azure Backup solution they use. In addition, you can configure custom alerts and actions for custom monitoring requirements across all Azure Backup workloads with this LA-based solution.
This solution now covers all workloads protected by Azure Backup, including Azure VMs, SQL in Azure VM backups, System Center Data Protection Manager connected to Azure (DPM-A), Microsoft Azure Backup Server (MABS), and file-folder backup from the Azure Backup agent.
Here’s how you get all the benefits.
Configure diagnostic settings
If you have already configured Log Analytics workspace to monitor Azure Backup, skip to the Deploy solution template section.
You can open the Diagnostic settings window from the Azure Recovery Services vault or from Azure Monitor. In the Diagnostic settings window, select “Send data to log analytics,” choose the relevant LA workspace, select the “AzureBackupReport” log, and click “Save.”
Be sure to choose the same workspace for all the vaults so that you get a centralized view in the
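If you prefer to script this rather than click through the portal, the same setting can be expressed as a payload for the Azure Monitor diagnostic-settings REST API. The Python sketch below is a minimal illustration; the resource IDs are placeholders, and you should verify the payload shape against the API version you target.

```python
def backup_diagnostic_settings(workspace_resource_id):
    """Sketch of a diagnostic-settings payload for a Recovery Services vault.

    Mirrors the portal steps above: send vault logs to a Log Analytics
    workspace and pick the "AzureBackupReport" log category. Use the same
    workspace ID for every vault to get the centralized view.
    """
    return {
        "properties": {
            "workspaceId": workspace_resource_id,
            "logs": [
                {"category": "AzureBackupReport", "enabled": True},
            ],
        }
    }

# Placeholder resource ID; substitute your subscription, resource group,
# and workspace names.
payload = backup_diagnostic_settings(
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.OperationalInsights/workspaces/<workspace>"
)
```

Scripting the setting also makes it easy to loop over all vaults in a subscription, which is the practical way to honor the same-workspace-everywhere advice.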
The future of mobile banking is clear. People love their mobile devices, and banks are making big investments to enhance their apps with digital features and capabilities. As mobile banking grows, so does the one aspect of it that can be wrenching for customers and banks: mobile device fraud.
The problem: Implementing near real-time fraud detection
Most mobile fraud occurs through a compromise called a SIM swap attack, in which a mobile number is taken over: the number is transferred to a SIM card the criminal controls, so the criminal receives all the text messages and calls sent to the victim’s mobile device. Login credentials are then obtained through social engineering, phishing, vishing, or an infected downloaded app. With this information, the criminal can impersonate a bank customer, register for mobile access, and immediately start to request fund transfers and withdrawals.
Artificial Intelligence (AI) models have the potential to dramatically improve fraud detection rates and detection times. One approach is described in the Mobile bank fraud solution guide. It is a behavior-based AI approach that can be much more responsive to changing fraud patterns than rules-based or other approaches.
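The solution guide's trained models are not reproduced here, but the flavor of a behavioral score can be sketched in a few lines of Python. Every feature, threshold, and weight below is an invented assumption for illustration; a real deployment would learn these from labeled transaction data rather than hand-set them.

```python
from statistics import mean, pstdev

def fraud_score(history_amounts, txn):
    """Toy behavioral score: how unusual is this transaction for this account?

    `history_amounts` holds the account's recent transfer amounts; `txn`
    carries the new amount plus two behavioral flags that echo the SIM-swap
    pattern (new device, acting immediately after enrollment).
    """
    mu = mean(history_amounts)
    sigma = pstdev(history_amounts) or 1.0
    amount_z = abs(txn["amount"] - mu) / sigma   # amount vs. the account's habit
    score = min(amount_z / 4.0, 1.0)             # squash into [0, 1]
    if txn["new_device"]:
        score += 0.3                             # freshly registered device
    if txn["minutes_since_enrollment"] < 60:
        score += 0.3                             # fraudsters move fast after takeover
    return min(score, 1.0)

history = [120.0, 80.0, 150.0, 95.0, 110.0]
risky = {"amount": 5000.0, "new_device": True, "minutes_since_enrollment": 10}
normal = {"amount": 100.0, "new_device": False, "minutes_since_enrollment": 9000}
```

Because a score like this is a handful of arithmetic operations over features that are already in memory, it is easy to see how a production pipeline built on the same idea can return a decision well inside a two-second budget.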
The solution: A pipeline that detects fraud in less than two seconds
Latency and response times are critical