
The full form of SLT is SAP Landscape Transformation, which is a trigger-based replication technology used to pass data from a source system to a target system. The source can be either an SAP or a non-SAP system, whereas the target is an SAP HANA system containing the HANA database.
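The idea behind trigger-based replication can be sketched with database triggers that write every change to a logging table, which a replication job then reads. This is only an illustrative sketch using SQLite; the table names and the logging-table layout are invented for the example, and SLT's real logging tables and triggers work differently.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source table and a logging table that records changed keys.
cur.execute("CREATE TABLE vbak (vbeln TEXT PRIMARY KEY, netwr REAL)")
cur.execute("CREATE TABLE vbak_log (vbeln TEXT, op TEXT)")

# Triggers: every insert/update on the source table writes a log entry.
cur.execute("""
    CREATE TRIGGER trg_vbak_ins AFTER INSERT ON vbak
    BEGIN INSERT INTO vbak_log VALUES (NEW.vbeln, 'I'); END
""")
cur.execute("""
    CREATE TRIGGER trg_vbak_upd AFTER UPDATE ON vbak
    BEGIN INSERT INTO vbak_log VALUES (NEW.vbeln, 'U'); END
""")

cur.execute("INSERT INTO vbak VALUES ('0001', 100.0)")
cur.execute("UPDATE vbak SET netwr = 150.0 WHERE vbeln = '0001'")

# The replication job would read the log, fetch the changed rows, push them
# to the target (HANA), and then clear the processed log entries.
changes = cur.execute("SELECT vbeln, op FROM vbak_log").fetchall()
print(changes)  # [('0001', 'I'), ('0001', 'U')]
```

The point is that the source application needs no modification: the triggers capture changes as a side effect of normal inserts and updates.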

State the different types of replication

The three types are as below:

1. ETL-based replication using BODS
2. Trigger-based replication using SLT
3. Extractor-based data acquisition using DXC

Can we load and replicate data from one source system to multiple target database schemas of the HANA system?

Yes. Data can be loaded and replicated to up to four target schemas.

What are Transformation rules?

Added on June 5th 2015 by Guest

A transformation rule is a rule specified in the Advanced Replication Settings transaction for source tables in such a way that the data is transformed during the replication process. For example, one can specify rules to convert fields, fill empty fields, and skip records.
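The three rule kinds mentioned above can be sketched as a simple record filter. The record layout and the concrete rules below are invented for illustration; real rules are defined in the Advanced Replication Settings transaction, not in code.

```python
def transform(records):
    """Apply sketch versions of the three SLT rule kinds to a record list."""
    out = []
    for rec in records:
        if rec.get("status") == "X":                  # skip-records rule
            continue
        rec = dict(rec)
        rec["currency"] = rec["currency"].upper()     # convert-field rule
        if not rec.get("region"):                     # fill-empty-field rule
            rec["region"] = "UNKNOWN"
        out.append(rec)
    return out

rows = [
    {"id": 1, "currency": "eur", "region": "", "status": ""},
    {"id": 2, "currency": "usd", "region": "NA", "status": "X"},
]
print(transform(rows))
```

Here the second record is skipped, the currency code is normalized, and the empty region is filled with a default.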

After the SLT replication is over, what objects are created?

Added on June 5th 2015 by Guest

After the replication is over, the SLT replication server creates the following objects in the HANA system:
1 User
1 Privilege
4 Roles
2 Stored procedures
8 Tables

What can be the maximum table name length in SAP HANA?

Added on June 6th 2015 by Ashok Kumar Reddy

The maximum table name length is 127 characters.

What can be the maximum number of rows in each HANA table?

Added on June 6th 2015 by Ashok Kumar Reddy

Limited by the storage size. Row store: 1 TB / sizeof(row). Column store: 2^31 rows, multiplied by the number of partitions.
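The two limits quoted above can be checked with a quick back-of-the-envelope calculation. The 100-byte row size and the four partitions are assumed values for illustration only.

```python
# Row store: bounded by the 1 TB row store size divided by the row width.
ROW_STORE_LIMIT_BYTES = 1 * 1024**4           # 1 TB
row_size = 100                                # assumed bytes per row
max_rows_row_store = ROW_STORE_LIMIT_BYTES // row_size

# Column store: 2^31 rows per partition, times the number of partitions.
partitions = 4                                # assumed partition count
max_rows_column_store = 2**31 * partitions

print(max_rows_row_store)     # 10995116277
print(max_rows_column_store)  # 8589934592
```

So a non-partitioned column table hits the 2^31 (about 2.1 billion) row ceiling, which is why partitioning is recommended for very large tables.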

SAP HANA Architecture Overview:

The SAP HANA database is developed in C++ and runs on SUSE Linux Enterprise Server. It consists of multiple servers, the most important component being the Index Server. The SAP HANA database consists of the Index Server, Name Server, Statistics Server, Preprocessor Server and XS Engine.

1 Overview of Transient Provider, Virtual Provider and Composite Provider in BW 7.4

DSOs and InfoCubes in BW 7.4 are HANA-optimized by default. This reduces activation time and has a huge impact on performance. Similarly, modeling HANA views in BW on HANA as Transient Providers, Composite Providers and Virtual Providers also has a huge impact on performance if the underlying database is HANA.

1.1 Transient Provider

Key features:

Its metadata in BW is not persisted, but is always generated at runtime.
BEx queries built on top can adapt to changes automatically as far as possible.
Navigational attributes of an assigned InfoObject cannot be used.
It cannot be used in a MultiProvider.
It can only be used in a Composite Provider in order to merge with other InfoProviders.
It is exposed to BEx and BI tools.

1.2 Composite Provider

Its biggest advantage is that it performs both union and join operations, which overcomes the limitations of InfoSets and MultiProviders in BW.
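The union-plus-join point can be made concrete with plain record lists: a MultiProvider-style UNION stacks rows from both providers, while an InfoSet-style JOIN matches them on a key, and a Composite Provider can combine both in one model. The field names below are invented for the example.

```python
# Two "providers": header rows and item rows sharing a document number key.
header = [{"doc": "A1", "region": "EU"}, {"doc": "A2", "region": "US"}]
items  = [{"doc": "A1", "amount": 10}, {"doc": "A1", "amount": 5}]

# UNION: append the rows of both providers into one result set.
union = header + items

# INNER JOIN on the document number, as a join binding would do.
join = [
    {**h, **i}
    for h in header
    for i in items
    if h["doc"] == i["doc"]
]
print(join)
```

Document A2 has no items, so it appears in the union but drops out of the inner join.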

1.3 Virtual Provider

Key features:

It offers a very flexible way of integrating data not stored in BW objects into the consistent BW world.
We can turn on navigational attributes for the Virtual Provider as usual. We can then map the navigational attribute to a field in the HANA model in the provider-specific properties; if you do not map it, the data comes from the master data tables in BW.
The Virtual Provider can be transported as usual, it can be used in a MultiProvider, and BEx queries can be built on top, just like for any other Virtual Provider.
Data read access at query runtime is not via SQL; it uses the same special Analytics API that is used to access data in InfoCubes, DSOs and MultiProviders. This enables the BW Analytic Manager to push down operations to HANA instead of performing them in the application server.

3 Source Data Extraction (LO Extraction)

We will now see the basic steps of extraction for sales header data from the EC6 system.
Log in to EC6 with a proper username and password.

First, we activate the data sources using the relevant transaction code.

Next we have to fill the setup tables for the data sources. To do this, we first need to delete the contents already present in the setup tables.

Go to transaction code LBWG and select the application component whose content has to be deleted.
To check whether the setup table has been deleted, go to transaction code RSA3 and enter the data source name; it will show "0 records found", which means the data has been deleted from the setup table.
Now, using transaction code LBWQ, delete the extraction queue of the data source.
Also delete the delta queue for the data sources using transaction code RSA7.
Next, fill the setup tables using transaction code SBIW or OLI*BW, where * denotes the application component.

Click on Logistics under Settings for Application-Specific Data Sources -> Managing Extract Structures -> Application-Specific Setup of Statistical Data.

Now click on the execute button of Perform Setup: Sales, and do the same for Delivery.

We will now check the setup table contents using transaction code RSA3.

2 Source Data Extraction (using BO Data Services)

We will now see the basic steps of extraction and loading from a flat file system to HANA using BO Data Services.

5 Modeling in HANA

We will now see the basic steps of modeling in HANA.

Log in to the HANA database via HANA Studio by clicking Add System.

An analytic view is created on the item table (VBAP) replicated by BO Data Services.

NOTE: Make sure to set the measure type "Amount with Currency" for measure attributes like COST, NETVALUE and NETPRICE to avoid mismatch problems with the predefined InfoObjects while creating transient or virtual providers.

CURR is a currency field of the item table (VBAP).

6 Modeling in BW on HANA

We will now see the basic steps of modeling data marts in BW on HANA.
6.1 Transient Provider

Using T-code RSDD_HM_PUBLISH, we create an Analytic Index, which is the Transient Provider in BW on HANA.

Using T-code RSDD_LTIP, we see the transient provider that we have already created.
Select the Analytic Index and click Display.

While creating it, mention the InfoArea.

Now we can mention the reference InfoObject for each attribute. Reference InfoObjects are used to copy the metadata structure to the attribute.
NOTE: The reference InfoObject is not mandatory in a transient provider.

6.2 Composite Provider

Now we create a composite provider on the sales header data (HANA-optimized DSO) and sales item data (transient provider).
NOTE: We can model a Composite Provider either by using the T-code RSLIMO / RSLIMOBW or the Workspace Designer.

We can use either UNION or JOIN operands in the modeling of a Composite Provider based on requirements, which is an advantage over the MultiProvider (union only) and the InfoSet (join only).

Drag and drop the HANA-optimized DSO from the left panel to the right panel and use binding type "UNION".
Drag and drop the transient provider from the left panel to the right panel and use binding type "Inner JOIN".
NOTE: In a composite provider, there has to be one provider with binding type UNION.

6.3 Virtual Provider

Next we create a virtual provider on the HANA analytic view.

Select and click "Virtual Provider based on HANA Model".
Mention the package name and view name.
Next we click the "Assign HANA Model Attributes" button in the upper right corner in order to assign the attributes to InfoObjects.
Select the "Propose Mapping" checkbox and click Continue.
Select the required attributes; if a proposed mapping is not found, map it manually.
For manual mapping, right-click the dimension and click the provider-specific InfoObject properties.
Similarly, do the same for the key dimensions.
Note: Make sure measure attributes like COST, NETPRICE and NETVALUE have the same data type as the predefined InfoObjects.
Finally, the virtual provider is created and we can now check the data.

Export and Import packages

Hi folks,
In our HANA landscape we have set up packages in the content node based upon functional area, for example FI, MM, PP, SD etc. Sometimes we may want to export/import a single view from a package, because there could be multiple developers working on views within the same package and the other views may not be ready to be promoted. In these cases I have full access in the destination system for IMPORTED objects and NATIVE objects, so I can do a developer-mode import and manually activate a single view if I want.
Now the issue I'm facing is that we have a newer BW on HANA environment and I'm getting push-back about the access to ACTIVATE IMPORTED objects, for example, and I'm told to do everything via delivery unit. (I do not have full administrative rights on our BW HANA as I do in our Enterprise HANA.) This creates a big problem when we don't want to move all the views from development: I've had to move views out of the package in development to a temporary package, do the export/import, and then copy the views back. Absolutely ugly.

SAP HANA Export and Import Views

Objective: You want to export HANA views from one system to another in your HANA landscape using the Developer mode.
You can also migrate the views using the "Delivery Units" method.

Prerequisite: You need the SYSTEM privileges EXPORT and IMPORT assigned, and you should have the package privilege for your source package.

Steps: How to Export and Import SAP HANA Views

EXPORT of views:
1. In HANA Studio, log in to the source system.
2. Now go to File - Export.
3. You will get the following window. Here we are using Developer Mode to perform the view transports, so select DEVELOPER MODE and press NEXT.

IMPORT of views in the target system:

1. In HANA Studio, log in to the target system.
2. To import the content we exported, go to FILE - IMPORT.
3. Select DEVELOPER MODE and press NEXT.
4. Select the target system; you should be logged in to the system to see it listed.
5. Browse to your exported content, select the views and press ADD (shown below).
Make sure you have the same hierarchy structure displayed as in your HANA Content folder.
6. Once you have all your content added to the right-hand side, select the target folder and press FINISH to start the import.

SAP BW 7.4 is released and we are excited to check the new functionality in the system. In this blog series we will collect the new or enhanced features most interesting to us: integration with the SAP HANA database, planning in SAP BW, LSA++, Operational Data Provisioning (ODP) and much more.


Logically, the major developments centre on the integration between SAP BW and the SAP HANA platform. SAP NetWeaver BW 7.4 utilizes SAP HANA as an underlying database and facilitates the optimization of applications for SAP HANA. Developers can now implement standard InfoCubes, standard DataStore objects and semantically partitioned objects which are optimized for SAP HANA. On top of that, existing objects can also be converted to SAP HANA-optimized ones. The remodeling feature is also enhanced, so that it suits the new objects.
New InfoProvider

The new InfoProvider is a VirtualProvider based on an SAP HANA model. Analytic views and calculation views are available as SAP HANA models. This type of VirtualProvider is capable of serving stable, long-term scenarios. Moreover, if you want a purely virtualized model, there is the option to create virtual master data as well.

Planning Applications Kit

In SAP BW 7.4 we have a new technical implementation scenario for planning applications. Now, besides the regular implementation model with Integrated Planning on top of an RDBMS, the planning engine can also have read access to the SAP HANA database, or can use internal routines in the SAP HANA database for in-memory optimized planning using the Planning Applications Kit. In BW, you can perform planning using internal database routines. You can find further information in SAP Note 1637199.
Planning on a DataStore Object for Direct Update
With this release of SAP BW it is possible to use DataStore objects for direct update in
planning mode as planning-specific InfoProviders.

New Planning Function Type

The function type "Set Key Figure Values" is available in planning. The function sets values for key figures and distributes them according to reference data.
This is a completely new piece of functionality, which allows business users to create models using central data from the BW system and local data, and to react quickly to new and changing requirements. The functionality targets users who would like to create ad-hoc scenarios or make a rapid prototype of a reporting solution. The environment is integrated into the existing BW landscape, but makes the typical BW functions less complex.

DB Connect can now be used to replicate data to BW from any schema in the SAP HANA database.
Using the ODP data replication interface, developers can transfer data from specific SAP repositories (especially from SAP HANA and SAP Business ByDesign) to SAP NetWeaver Business Warehouse.

As of this release, developers are able to load mass data directly into an InfoProvider, without loading the data into the PSA first. This is done through the ODP data replication interface. A DTP setting is used to automatically activate master data after it has been updated. This is possible because aggregates are not used with the SAP HANA database.
Moreover, a query can be used as the data source of a data transfer process of type "full". This allows the extraction of query data and its subsequent distribution through an Open Hub Destination. A possible target could be an SAP HANA database.

Advanced DSO
Hi Diva,
1. What is the difference between an ADSO and an Open ODS view? Both support field-based reporting and InfoObject-level reporting. What are the benefits of using an ADSO over an Open ODS view for reporting?
An Open ODS view is a virtual provider, whereas an ADSO is a persistent InfoProvider.

2. Can an ADSO act as a virtual provider, i.e. can it report directly from an ECC source?
No. By using SLT you can report on live data, but an ADSO cannot be used as a virtual provider.

3. What are the partitioning options available in an ADSO?

Check the "Dynamic Range Partitioning" feature for Advanced DataStore Objects. SPO support is planned for a future release.
As per SAP Note 2044468, "Partitioning is only available for tables located in the column store. The row store doesn't support partitioning." SAP also recommends partitioning if there is a risk of reaching the 2 billion record limit.

4. Is there a possibility to write partitioned data to disks in ADSO to reduce memory?

A given database table is either memory-based (HANA column or row tables), or disk-based (extended table of
dynamic tiering). It is not possible to create a single table that has partitions in memory and partitions in dynamic
tiering. Extended tables cannot be partitioned. When a partitioned in-memory column table is converted to an
extended table (ALTER TABLE USING EXTENDED STORAGE), the partitioning specification is dropped and the
extended table is created as a non-partitioned table that contains the contents of all partitions of the original table.

5. As far as I know, the request handling technique is different between an aDSO and an in-memory cube or in-memory DSO. Also, the inbound queue of an aDSO can be kept in the WARM data tier.