Saturday, 31 May 2014

SAP Bank Analyzer – Implementation

SAP Bank Analyzer Implementation Strategy – High-Level Implementation Overview

This article explains a recommended approach for implementing, upgrading, or integrating the SAP Bank Analyzer 8.0 solution. We recently worked on two demo solutions: a SAP Bank Analyzer implementation for a banking risk system within a risk and profit analysis team, and a short-term test strategy for a rapid SAP Bank Analyzer implementation at an investment bank.

In the next couple of blog posts we will detail an effective way to test SAP Bank Analyzer and its individual components.

 - SAP BA – objectives in an upgrade and in a new implementation

  • Business architecture
  • Test approach

There are two possible scenarios – an Accounting IFRS implementation or a risk architecture – see the diagrams below for both.

A)  Accounting IFRS implementation (accounting consequence, SGL scenarios)






B) Risk architecture



Let's review each of the elements in the diagram. See SAP SDN for SDS.

The first layer is the SDL component, which loads original data from operational or source systems into the Source Data Layer (SDL) by means of an extraction, transformation, and loading (ETL) process. The SDL saves, consolidates, and manages the original data. At the same time it provides interfaces to additional operational systems.
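The ETL flow into the SDL can be sketched as follows. This is an illustrative Python sketch only: the table and column names (`source_system`, `sdl_flow_data`) are assumptions made up for demonstration, not the actual Bank Analyzer schema or API.

```python
import sqlite3

# Illustrative only: these table names are assumptions, not the SAP SDL data model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_system (contract_id TEXT, amount TEXT, currency TEXT)")
conn.execute("CREATE TABLE sdl_flow_data (contract_id TEXT, amount REAL, currency TEXT)")

# Extract: raw records as delivered by an operational source system.
conn.executemany(
    "INSERT INTO source_system VALUES (?, ?, ?)",
    [("C1", "100.50", "eur"), ("C2", "250.00", "gbp")],
)

# Transform: cast amounts to numbers and normalise currency codes.
rows = [
    (cid, float(amt), cur.upper())
    for cid, amt, cur in conn.execute("SELECT * FROM source_system")
]

# Load: write the consolidated records into the SDL staging table.
conn.executemany("INSERT INTO sdl_flow_data VALUES (?, ?, ?)", rows)

loaded = conn.execute("SELECT COUNT(*) FROM sdl_flow_data").fetchone()[0]
print(loaded)  # number of records now in the SDL
```

The same extract–transform–load shape applies regardless of the actual middleware used to feed the SDL.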

The primary objects of the Source Data Layer (SDL) and their scenario versions are a flexible way of saving master data, flow data, and market data. They also group this data into units that belong together logically from a business perspective. This ensures that the Bank Analyzer components that are linked to the SDL have a standard, consistent data source.

In addition to storing primary object data, the SDL provides the following primary-object functions for applications linked to it:






  • Tools
  • Integration


The SDL provides both the central original data basis and a part of the underlying infrastructure for linked applications. It is therefore a key element in ensuring the consistency of data and results.


(To be updated soon – if you need the detailed document, please email your request.)


Category 1 - business transactions





  • Post external business transactions
  • Update secondary business transactions

Category 2 - key date valuation
 



  • Key date valuation for all financial positions
  • Financial positions for the period (PICC) event processing, etc.
  • Update costing for the period before GL posting and during the transfer of GL documents
  • Calculation for all PICC

Category 3 – SAP GL Postings



  • GL Connector and document generation – journal and document review, plus finance reconciliation of the GL, profit centre, and any material code posting review
  • Balance processing step 1 - (SV)
  • Balance processing step 2 - (ATP)


Testing of the Accounting for Financial Instruments (AFI) aggregation for account results before the RDL and GL


Category 1 – position management




  • Generate Financial Instrument Positions
  • Aggregate Financial Transactions
  • Prepare Business Transactions for Aggregation
  • Process Reassignments for Business Transactions
  • Aggregate Business Transactions
  • Aggregate Current Accrual Results
  • Process Reassignments for Accrual Results
  • Aggregate Retroactive Accrual Results
  • Derive Application Events for Source Data Aggregation
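The "Aggregate Business Transactions" step above can be illustrated with a minimal sketch. The position keys and amounts here are hypothetical sample data; the real Bank Analyzer aggregation operates on its own data model.

```python
from collections import defaultdict

# Hypothetical business transactions as (position_key, amount) pairs;
# the field names are illustrative, not the Bank Analyzer data model.
business_transactions = [
    ("LOAN-001", 100.0),
    ("LOAN-001", -25.0),
    ("LOAN-002", 500.0),
]

# Aggregate individual business transactions into per-position totals.
positions = defaultdict(float)
for key, amount in business_transactions:
    positions[key] += amount

print(dict(positions))  # {'LOAN-001': 75.0, 'LOAN-002': 500.0}
```

Each of the other aggregation steps follows the same pattern of grouping detail records under a position-level key.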

Imported Sub-Ledger Documents (ISD) for Results

 ISD sub-ledger import

Data Warehouse Testing Recommendations


A data warehouse / business intelligence system is challenging to test. Standard testing methodology tests one little thing at a time, but a DW/BI system is all about integration and complexity, not to mention large data volumes. Here are my top five recommendations for building and executing a testing environment for your DW/BI project.

1. Create a small static test database, derived from real data.
You want it to be small so tests can run quickly. You want it to be static so that the expected results are known in advance. And you want it to be derived from real data because there’s nothing like real data to give you a realistic combination of scenarios, both good and bad. You will need to “cook” additional rows into the test database to test any branch of your ETL code that covers a data scenario not included in the original test data.
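The idea can be sketched as follows: take a deterministic sample of the real data, then "cook" in a row for an edge case the sample does not cover. The table names are invented for illustration.

```python
import random
import sqlite3

# A stand-in "real" database with 1000 account rows.
real = sqlite3.connect(":memory:")
real.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
real.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(i, float(i * 10)) for i in range(1000)])

test_db = sqlite3.connect(":memory:")
test_db.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")

# A fixed seed keeps the sample static, so expected results never drift.
random.seed(42)
sample_ids = random.sample(range(1000), 50)
placeholders = ",".join("?" * len(sample_ids))
rows = real.execute(
    f"SELECT id, balance FROM accounts WHERE id IN ({placeholders})",
    sample_ids).fetchall()
test_db.executemany("INSERT INTO accounts VALUES (?, ?)", rows)

# "Cooked" row: a negative balance to exercise an ETL branch
# the sampled real data never hits.
test_db.execute("INSERT INTO accounts VALUES (?, ?)", (9999, -1.0))

count = test_db.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(count)  # 51 rows: 50 sampled + 1 cooked
```

Because both the sample and the cooked rows are fixed, every expected result can be computed once and reused on every test run.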

2. Test early and often.
Start testing as soon as you write a line of code (or connect two boxes in your ETL tool’s user interface). Developers do this all the time, of course, developing and running unit tests to ensure their code does what it’s supposed to do. Many developers aren’t as good about keeping track of all those tests, and running them often. Daily. Every time you check in code. If you run your tests daily, and prioritize fixing any tests that broke yesterday, it will be easy to determine what went wrong.
Unit testing assures that a developer’s code works as designed. System testing ensures that the entire system works, end to end, according to specifications. System testing should also begin early. There’s an official test phase before you go live; this test phase is for running tests and fixing problems, not for identifying what the tests should be and how to run them. Start system testing early in the development process, so all the kinks are worked out long before the pressure-cooker system testing phase begins.
3. Use testing tools and automate the test environment.
The suggestion to test early and often is practical only if you automate the process. No developer is going to spend the last hour of the work day babysitting unit tests! And few teams can afford a full time tester to do that work on the developers’ behalf.
To automate testing, you need tools. Many organizations will already have system quality assurance testing tools in place. If you don’t, or if you’re convinced your tools won’t meet the needs of the DW/BI system testing, try googling “software quality assurance tools” for an overwhelming list of products and methodologies available at a wide range of costs.
All commercial software test tools will allow you to enter tests, execute tests, log the results of test runs, and report on those results. For unit testing and data quality testing, define tests to run a query in the source and target data warehouse. You’re looking for row counts and amounts to match up.
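A source-versus-warehouse reconciliation check of this kind can be sketched as below. The table names (`src_orders`, `dw_fact_orders`) are assumptions for illustration; in practice the two queries would run against the source system and the warehouse respectively.

```python
import sqlite3

# One connection stands in for both source and target here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE dw_fact_orders (id INTEGER, amount REAL)")
sample = [(1, 10.0), (2, 20.5), (3, 30.0)]
conn.executemany("INSERT INTO src_orders VALUES (?, ?)", sample)
conn.executemany("INSERT INTO dw_fact_orders VALUES (?, ?)", sample)

def totals(table):
    # One query per side: row count plus a control total on the amount.
    return conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {table}").fetchone()

src, dw = totals("src_orders"), totals("dw_fact_orders")
match = (src == dw)
print(match)  # True when counts and amounts reconcile
```

A mismatch in either the row count or the control total flags the ETL run for investigation before the results reach users.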
A testing tool used for DW/BI testing must be able to run a script that sets up the test environment before the tests are run. Tasks you may need to execute include:
  • Restoring a virtual machine environment with clean test data
  • Modifying the static test data with special rows to test unusual conditions
  • Running your ETL program
After the tests are executed and logged, end with a cleanup script, which may be as simple as dropping the VM environment.
Standard testing methodology has you change one thing, run a test, and log results. In the DW/BI world, you should expect to group together many tests into a test group. Even with a tiny test database, you don’t want to execute your ETL code for each of the hundreds of unit tests that you should be running.
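Grouping many checks around a single ETL execution can be sketched with Python's `unittest`: the (expensive) ETL runs once in `setUpClass`, and each test in the group then asserts on the shared output. The `run_etl` function is a hypothetical stand-in for your real ETL program.

```python
import unittest

def run_etl(source_rows):
    # Stand-in for the real ETL program: uppercase codes, drop blanks.
    return [r.upper() for r in source_rows if r.strip()]

class EtlTestGroup(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Run the ETL once for the whole group, not once per test.
        cls.output = run_etl(["abc", "", "def"])

    def test_blank_rows_dropped(self):
        self.assertEqual(len(self.output), 2)

    def test_codes_uppercased(self):
        self.assertEqual(self.output, ["ABC", "DEF"])

suite = unittest.defaultTestLoader.loadTestsFromTestCase(EtlTestGroup)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

The same structure extends naturally: a setup script restores the static test data and runs the ETL, a test group executes hundreds of assertions against the result, and a cleanup script tears the environment down.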

4. Enlist the business users to define system tests.
We need the business users’ expertise to define good system tests. How do we know the data is correct? How do we know that query performance meets their expectations? Enlisting business users in the test specification process will ensure better testing than if the DW/BI team just made up tests based on what they think is interesting. Engaging key business users in the quality assurance process also provides a huge credibility boost.
5. The test environment must be as similar as possible to the production environment.
It is vitally important that the test environment be similar to production. Ideally, it's exactly the same hardware, software, and configuration. In the real world, relatively few organizations have the budget for two big DW servers. But any organization can, and should, make the following elements match up:
  • Drive configuration (relative names for drives). Disk is cheap and you should be able to duplicate your disk for test. But if you can’t, at least make the drive letters and database file layout the same. Many people have whined at me that this is so much work to change the environment and make them the same. Yes it is! And so much better to do it now than in the final testing phase of your project.
  • Software versions from the operating system to the database to the users’ desktops, and everywhere in between.
  • Server layout. If the reporting system software will be on its own server in production, test it that way.
  • Security roles and privileges for back room service accounts. Your deployment is virtually guaranteed to fail if you don’t test security roles first. I don’t know why, but it always seems to go wrong.

If you follow these suggestions, especially the suggestion for continuous testing, you are likely to have a smooth, crisis-free test phase and move into production on schedule. If not, you’re running a serious risk of having an otherwise fabulous project be delayed interminably in QA hell, with business users and management rattling the doors.


About Us- SAP TAO Ltd - IT Consulting & Services
