
SAP Stress Test with .NET - Part II - How to use it

Posted by Hynek Petrak Dec 4, 2013
First part is here: http://scn.sap.com/community/interoperability-microsoftnet/blog/2013/11/28/sap-stress-test-with-net--part-i

Part II: How to use the tool

You may download the installation here: SAP Stress Tool 1.9.5
It's been tested on a few systems, but I still consider it a beta version.
Read the license conditions below: the tool is provided as is, without any warranty
or liability for any kind of damage. Do not test on production systems!!! Do not
test without permission from your support team and management!
It's almost impossible to use the tool without reading the instructions below, so read carefully.

Copyright & License

Read the license.txt installed along with the tool for detailed license and copyright information.

Download the setup and run it. The tool requires Microsoft .NET 4.0 to be installed. By
default the tool installs under the user's local application data folder. Administrative
privileges are not required to install it.
Requirement: The application requires the SAP .NET Connector 3.0, 32-bit version, which is licensed.
It has to be downloaded from http://service.sap.com/connectors.
After the .NET Connector installation, please locate sapnco.dll and sapnco_utils.dll and copy
them over into the SAP Stress Tool application directory.
Tested on Windows XP and Windows 7. The tool comes as a 32-bit .NET application. It runs in
both 32-bit as well as 64-bit environments. The .NET Connector is then loaded along with the
tool, so a separate installation is not required.

The configuration is done via an XML file, "config.xml". There are no plans to build any kind
of user interface to edit the configuration; use Notepad or an XML editor of your choice.
After installation, you'll find ConfigExample.xml in the target folder. You may
rename/copy it into a new Config.xml. For those who know how to work with XML Schemas, there is
also a "CfgSchema.xsd" installed.
Example 1: Config.xml for a single-server environment and a load-balanced environment.


<?xml version="1.0" encoding="utf-16"?>
<Config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://tempuri.org/CfgSchema.xsd">
  <Systems>
    <System ID="SPD" SystemNumber="00" Client="122" User="Username"
            Password="password:put_pwd_here" Server="server1.example.com">
      <SalesOrder number="2713100768" />
      <SalesOrder number="3113009124" />
      <SalesOrder number="3113009114" />
    </System>
    <System ID="PRALB" SystemNumber="00" Client="200" User="Username"
            Password="password:your_password" Server="message_server.example.com"
            LoadBalancing="true" LogonGroup="BRIDGE" MsgSrvPort="3600">
      <Properties>
        <Property Name="UseRFC_READ_TABLE" Value="No" />
      </Properties>
      <SalesOrder number="2713100768" />
      <SalesOrder number="3113009124" />
      <SalesOrder number="3113009114" />
    </System>
  </Systems>
</Config>

Config file description

You may put in as many system configurations <System /> as you want. They do
not interfere with each other and are treated separately.

At run time, only one system configuration can be active.

In the above example you can see two systems configured:

SPD = a system with a direct connection to the application server

PRALB = a system with load balancing. The connection is made via the message
server and a specific logon group (here: BRIDGE). Do not forget to update your
"services" file with the appropriate message server port!

When you configure a system for the first time, put your password in as
"password:your_password", where "password:" is a fixed keyword. When the tool is run,
your password will be automatically encrypted and stored in the config.xml.

It is advised to create special user accounts for testing, with limited validity. Even
though the password is encrypted, it is decrypted at run time, so it should be
considered compromised if the config.xml is shared with another person.

Use the <SalesOrder number="2713100768" /> tag to list some sales orders.
These will later be used for simulation. The sales orders must exist in the target system
and the number must contain leading zeroes. The sales orders are assumed to be of standard
and simple types. The tool cannot simulate advanced order types where you need to
provide additional conditions manually; in that case you will get errors in one of the log files.

How to deal with the user interface

Before the first run, make sure that you know what you are going to do ...

From the "Target system:" drop-down menu, select the system configuration to be tested.

If you have added some sales orders to be simulated, the application attempts to
download details about those sales orders. You'll be prompted; confirm with 'Yes'.
Increase the number of connections by adjusting the "Threads:" spin control.
Start slowly.
The maximum is somewhere around 95 per instance of the tool. If you need
more parallel connections, you have to run the tool multiple times, preferably from
multiple machines in parallel.

The tool will start to establish connections to the remote SAP system. There is
a slight delay needed for logon.

If you decrease the number, the connections will not be closed
immediately, but as soon as the running task finishes.
There is an idle time between each transaction/RFC call. This delay can be controlled
with the "Delay" spin control. For sales order simulation it's suggested to keep it reasonably
long. On some systems there are CO (Controlling) reports running asynchronously; this
delay allows those jobs not to overlap and thus exhaust system resources.
The "SO para" spin control (sales order simulation parallelism) specifies how many sales order
simulation jobs can run in parallel. If you have configured a limited number of sales orders,
it's advised to set "SO para" low. In general, it is best to configure as many sales orders as
possible, to have versatility in the simulation. Then you may increase the "SO para" value.
Use the "Keep user session open" check box to control session management:




Checked = once logged on, the session is kept open. Every transaction
runs like: log on => invoke function => wait delay time => invoke function 2 => wait delay
time => ...

Unchecked = each session is terminated right after the function finishes.
Every transaction runs like: log on => invoke function => log off => wait delay time =>
log on => invoke function 2 => log off => ...
The bottom part of the window reports how many of each RFC call have been made
and the average response time over the last 20 calls (a floating average).
On the right side you see the memory consumption of the tool and the current number of
connections (threads).
In the application folder you'll find three log files (log*.txt) after each run. They contain
errors, transaction times and memory profiles. Do not upload the error log file to a
public place, as it contains your user password!!
I appreciate your comments or requests on what the next blog part should cover ... I have not
yet covered the design or the test results; let me know if you want to see some code snippets ...


I do not intend to open source it.


SAP Stress Test with .NET - Part I - Intro

Posted by Hynek Petrak Nov 28, 2013
How to use the tool is covered here: http://scn.sap.com/community/interoperability-microsoft-net/blog/2013/12/04/sap-stress-test-with-net--part-iii

Part I: The challenge

I used to be a developer (programmer) in industrial automation. For many years. Better to
say, I worked as a developer for many years. Some things you simply cannot forget, like
how to swim ...
I no longer work as a developer, and haven't for some years.
I now work for a global company. We have a lot of SAP instances around the globe. Some
are small, some have several thousand users. And have had for many years.
From time to time you need to refresh the hardware; sometimes there are good reasons to
change your server operating system; another time you do an SAP upgrade with or without
Unicode conversion; sometimes you go virtual. Usually you have a very short time for the
cutover (i.e., business downtime). You can almost never afford mistakes or a failed go-live.
Sometimes it is more efficient to combine things and do a couple of changes at once. But the
more changes you do at one time, the riskier the project can be.

My father-in-law always used to say: "Who's worried stays behind." (In Czech it's
a funny quote.) He was 28 times national champion in golf.
A few years ago we did a few upgrades from version 4.6 to ECC 6. On one of those
upgrades we faced slight trouble after go-live. The system started to collapse during rush
hour, once per day, when a peak number of users entered the system. The system was low
on physical memory. We got a maintenance window to put in more memory, but how to make
sure it was sufficient? A stress test? But how do you make 700 users connect to the system,
just for testing purposes? Impossible.
At that time I had some experience with SAP.Functions ActiveX scripting in VBScript. More
precisely, I knew how to call RFC_READ_TABLE :-). We had about a one-hour buffer, so I wrote a
simple script that called that RFC, and prepared another batch script that called the first
one many times in parallel.
Nice try, but a) RFC_READ_TABLE is not a very representative example of a "business
transaction", and b) it is very memory hungry on the client side, so after some 20 connections to
SAP the client PC's memory was completely exhausted (including virtual memory) and
the PC became unusable for 15 minutes due to extensive swapping.
Three months ago I came across the .NET Connector. Roughly at the same time I was told
that there were three other similar projects to be executed, and asked whether it would be
possible to use my VBScript for stress testing. But really, that one was written in one hour
and is not something I would be proud of. I can do better! Or not? It's been many years since
I was a developer.
So I took it a bit as a personal challenge: has a person nearing 40 forgotten how
to swim ...
You should know I ended up back then with Visual Studio 6. .NET came just after, but I did not
follow it.
So I took the latest Visual Studio 2012 and .NET. C# looked very similar to C++ (I had written
hundreds of thousands of lines in C/C++). I made a "Hello World" in C#, and what a surprise: it
was as if I were speaking my native language. I immediately downloaded and installed the
SAP .NET Connector to try to invoke my beloved RFC_READ_TABLE. But what a surprise:
NCo did not appear in the list of available references?? Nor anywhere in the menu to
be added as a component to my project?? I spent two days looking around and then I gave up;
perhaps I had gotten old and the .NET world was no longer for me. Had I missed that train?
The other day I met my colleague from Greece and expressed my disillusion to him. He said:
come on, it's simple, you just put the NCo DLLs into your application folder and add them
as references directly! What a hack! I tried it, and it works!
Boosted by this success, I wrote the SAP Stress Test tool in four days. But let's talk about the
design next time ...


SAP .NET Connector - RFC_READ_TABLE in 50

Posted by Hynek Petrak Oct 15, 2013
Find below a simple RFC_READ_TABLE call using the SAP .NET Connector 3.x:
ReadTable("PR1", "MARA", "MATNR,MTART,MATKL", "MTART EQ 'HAWA'", rows);
where "PR1" - the destination system
"MARA" - string, the table to query
"MATNR,xxx" - string, comma-separated column names. If empty (""), all table fields
are retrieved.
"MTART EQ 'HAWA'" - string, a WHERE clause in ABAP SQL syntax
rows - output variable, a list of strings to hold the result table rows, with columns delimited by "~"
Current limitation: in the below version of the code, the filter can only be up to 72
characters long. If you need a longer one, I suggest involving a word-wrapping function like this
one: http://bryan.reynoldslive.com/post/Wrapping-string-data.aspx

public bool ReadTable(string dest, string table, string fields, string filter, out List<string> rows) {
    string[] field_names = fields.Split(",".ToCharArray());
    RfcDestination destination = RfcDestinationManager.GetDestination(dest);
    IRfcFunction readTable;
    try {
        readTable = destination.Repository.CreateFunction("BBP_RFC_READ_TABLE");
    } catch (RfcBaseException ex) {
        //Log.Error(String.Format("\nError in function module RFC_READ_TABLE ({0})", ex.Message));
        rows = null;
        return false;
    }
    readTable.SetValue("query_table", table);
    readTable.SetValue("delimiter", "~");
    IRfcTable t = readTable.GetTable("DATA");
    t.Clear();
    t = readTable.GetTable("FIELDS");
    t.Clear();
    if (field_names.Length > 0) {
        t.Append(field_names.Length); // rows must exist before SetValue
        int i = 0;
        foreach (string n in field_names) {
            t.CurrentIndex = i++;
            t.SetValue(0, n);
        }
    }
    t = readTable.GetTable("OPTIONS");
    t.Clear();
    t.Append(1); // a single row holding the WHERE clause (max 72 characters)
    t.CurrentIndex = 0;
    t.SetValue(0, filter);
    //Log.Debug(string.Format("SELECT {0} FROM {1} WHERE {2}", fields, table, filter));
    readTable.Invoke(destination);
    t = readTable.GetTable("DATA");
    rows = new List<string>();
    for (int i = 0; i < t.RowCount; i++) {
        t.CurrentIndex = i;
        rows.Add(t.GetString("WA")); // WA holds the delimited result line
    }
    return true;
}
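As a rough illustration of the word-wrapping workaround mentioned above for filters longer than 72 characters, a long WHERE clause could be split at spaces into chunks of at most 72 characters and appended as separate OPTIONS rows. This is a hypothetical sketch only — WrapFilter is not part of the tool, and it assumes the filter contains spaces to break at:

```csharp
// Hypothetical helper: split a long WHERE clause at word boundaries
// into pieces of at most 72 characters, one piece per OPTIONS row.
static List<string> WrapFilter(string filter, int width = 72) {
    var parts = new List<string>();
    var line = new StringBuilder();
    foreach (string word in filter.Split(' ')) {
        // +1 accounts for the separating space
        if (line.Length > 0 && line.Length + word.Length + 1 > width) {
            parts.Add(line.ToString());
            line.Clear();
        }
        if (line.Length > 0) line.Append(' ');
        line.Append(word);
    }
    if (line.Length > 0) parts.Add(line.ToString());
    return parts;
}
```

The OPTIONS table would then be filled with one row per returned piece instead of a single row, e.g. appending parts.Count rows and setting TEXT on each; splitting only at spaces matters, because breaking the clause mid-token would produce invalid ABAP syntax when the rows are concatenated.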



RFC_READ_TABLE data into MS Access (along with the table structure)
Posted by Hynek Petrak Oct 15, 2013
A piece of code that has surely been posted many times: how to fetch SAP table data
with VBA from within Excel or MS Access.
However, this time the added value is that the local MS Access table is created on the fly,
based on the SAP table structure.
RFC_READ_TABLE(tableName, columnNames, filter, local_table_name)
where tableName - the SAP table name to fetch
columnNames - string, comma-separated column names to fetch. If an empty string "",
all table fields are retrieved.
filter - string, a WHERE clause in ABAP SQL syntax
local_table_name - string, the local table name to be created
Example 1:
... will create local table MY_MAST, with the given columns, for all MAST records for plant XX10.
Example 2:
... will create local table MY_MARD, with columns MATNR, LGORT, LGPBE, where LGORT is
either XX60 or XX61.
Example 3:
RFC_READ_TABLE("MARD", "", "LGORT = 'XX60' or LGORT = 'XX61'", "MY_MARD2")

... will create local table MY_MARD2, with all SAP table columns, where LGORT is either
XX60 or XX61.
Remark: I recommend using BBP_RFC_READ_TABLE instead of RFC_READ_TABLE, as with
the plain RFC_READ_TABLE I had performance problems and crash dumps on large tables.


Public Function RFC_READ_TABLE(tableName, columnNames, filter, table_name)
    Dim R3 As Object, MyFunc As Object, App As Object
    ' Objects to hold the IMPORT parameters
    Dim QUERY_TABLE As Object
    Dim DELIMITER As Object
    Dim NO_DATA As Object
    Dim ROWSKIPS As Object
    Dim ROWCOUNT As Object
    ' Where clause
    Dim OPTIONS As Object
    ' Fill with the fields to return. After the function call it holds
    ' detailed information about the columns of data (start position
    ' of each field, length, etc.)
    Dim FIELDS As Object
    ' Holds the data returned by the function
    Dim DATA As Object
    ' Used to write out results
    Dim ROW As Object
    Dim Result As Boolean
    Dim i As Long, j As Long, iRow As Long
    Dim iColumn As Long, iStart As Long, iStartRow As Long, iField As Long, iLength As Long
    Dim outArray, vArray, vField
    Dim iLine As Long
    Dim noOfElements As Long

    '**********************************************
    ' Create the server object and set up the connection.
    ' Use the same credentials as in the SAP GUI login.
    On Error GoTo abend:
    Set R3 = CreateObject("SAP.Functions")
    ' Fill in the logon details below
    R3.Connection.ApplicationServer = "x.x.x.x"
    R3.Connection.SystemNumber = "00"
    R3.Connection.System = "XX1"
    R3.Connection.Client = "120"
    R3.Connection.Password = "password"
    R3.Connection.User = "user"
    R3.Connection.Language = "EN"
    If R3.Connection.Logon(0, True) <> True Then
        RFC_READ_TABLE = "ERROR - Logon to SAP Failed"
        Exit Function
    End If

    '*****************************************************
    ' Call RFC function BBP_RFC_READ_TABLE
    '*****************************************************
    Set MyFunc = R3.Add("BBP_RFC_READ_TABLE")
    Set QUERY_TABLE = MyFunc.exports("QUERY_TABLE")
    Set DELIMITER = MyFunc.exports("DELIMITER")
    Set NO_DATA = MyFunc.exports("NO_DATA")
    Set ROWSKIPS = MyFunc.exports("ROWSKIPS")
    Set ROWCOUNT = MyFunc.exports("ROWCOUNT")
    Set OPTIONS = MyFunc.tables("OPTIONS")
    Set FIELDS = MyFunc.tables("FIELDS")

    QUERY_TABLE.Value = tableName
    DELIMITER.Value = ""
    NO_DATA.Value = ""
    ROWSKIPS.Value = 0
    ROWCOUNT.Value = 0 ' 0 = no row limit

    OPTIONS.AppendRow ' a row must exist before it can be assigned
    OPTIONS.Value(1, "TEXT") = filter ' where filter

    vArray = Split(columnNames, ",") ' columns
    j = 1
    For Each vField In vArray
        If vField <> "" Then
            FIELDS.AppendRow
            FIELDS.Value(j, "FIELDNAME") = vField
            j = j + 1
        End If
    Next

    Result = MyFunc.Call
    If Result = True Then
        Set DATA = MyFunc.tables("DATA")
        Set FIELDS = MyFunc.tables("FIELDS")
        Set OPTIONS = MyFunc.tables("OPTIONS")
    Else
        DLog "SAP RFC Error: " & MyFunc.EXCEPTION
        Exit Function
    End If
    noOfElements = FIELDS.ROWCOUNT

    '**************************************
    ' Create the local table based on the FIELDS metadata,
    ' then copy the contents of DATA into it.
    '**************************************
    Dim l As String
    Dim fipos
    ReDim fipos(1 To FIELDS.ROWCOUNT, 1 To 3)
    Dim db As DAO.Database
    Set db = CurrentDb()
    Dim sql As String

    On Error Resume Next
    db.Execute "DROP TABLE " & table_name & ";"
    If Err.Number <> 0 Then
        DLog "DROP TABLE Error: " & Err.Description
    End If
    On Error GoTo abend:

    sql = "CREATE TABLE " & table_name & " ("
    For iColumn = 1 To FIELDS.ROWCOUNT
        fipos(iColumn, 1) = FIELDS(iColumn, "OFFSET") + 1
        fipos(iColumn, 2) = CInt(FIELDS(iColumn, "LENGTH"))
        fipos(iColumn, 3) = FIELDS(iColumn, "FIELDNAME")
        If iColumn = FIELDS.ROWCOUNT Then
            sql = sql & FIELDS(iColumn, "FIELDNAME") & " CHAR(" & fipos(iColumn, 2) & "))"
        Else
            sql = sql & FIELDS(iColumn, "FIELDNAME") & " CHAR(" & fipos(iColumn, 2) & "), "
        End If
    Next
    db.Execute sql

    'DLog ("Saving " & DATA.ROWCOUNT & " records in local table " & table_name)
    Dim rs As Recordset
    Dim le As Long
    Set rs = db.OpenRecordset(table_name, dbOpenTable, dbAppendOnly)
    For iLine = 1 To DATA.ROWCOUNT
        l = DATA(iLine, "WA")
        le = Len(l)
        rs.AddNew
        For iColumn = 1 To FIELDS.ROWCOUNT
            ' Fields beyond the end of the returned line stay Null
            If fipos(iColumn, 1) > le Then
                GoTo skipme:
            End If
            rs.FIELDS(fipos(iColumn, 3)) = Trim(Mid(l, fipos(iColumn, 1), fipos(iColumn, 2)))
        Next
skipme:
        rs.Update
    Next
    rs.Close
    RFC_READ_TABLE = ""
    Exit Function

abend:
    RFC_READ_TABLE = Err.Description
End Function


SAP .NET Connector - storing connection details in SQL Server
Posted by Hynek Petrak Oct 14, 2013
I've seen many examples of IDestinationConfiguration implementations where the
connection details are more or less hard-coded. Find below an example where an MS SQL
Server table is used as storage for the connection parameters:


public class SqlDestinationConfiguration : IDestinationConfiguration {

    public RfcConfigParameters GetParameters(string name) {
        SqlConnection con = new SqlConnection(
            WebConfigurationManager.ConnectionStrings["MyConnectionString"].ToString()); // as defined in your [app|web].config file
        RfcConfigParameters cp = new RfcConfigParameters();
        SqlCommand cmd = new SqlCommand();
        SqlDataReader rdr;
        cmd.CommandText = "SELECT DISTINCT ApplicationServer, Client, " +
            "SystemNumber, Language, " +
            "UserName, encpwd, id, description, MessageServer, " +
            "System, UseMsgServer, GroupName " +
            "FROM [SAPSystems] Where [ID] = @id";
        cmd.CommandType = CommandType.Text;
        cmd.Connection = con;
        dynamic param = new SqlParameter("id", SqlDbType.NVarChar, 10);
        param.Value = name;
        cmd.Parameters.Add(param);
        con.Open();
        rdr = cmd.ExecuteReader(CommandBehavior.CloseConnection);
        if (rdr.HasRows) {
            rdr.Read();
            string encp = rdr.GetString(5); // encrypted password
            string ppw = null;
            try {
                // password encryption/decryption can be omitted
                ppw = DecryptPassword(encp, name); // decrypt password with salt = name (custom function)
            } catch (Exception ex) {
                throw new Exception("Failed to decrypt the password: " + ex.Message);
            }
            // 0-ApplicationServer, 1-Client, 2-SystemNumber, 3-Language,
            // 4-UserName, 5-encpwd, 6-id, 7-description, 8-MessageServer,
            // 9-System, 10-UseMsgServer, 11-GroupName
            if (rdr.GetBoolean(10)) { // load balanced?
                cp.Add(RfcConfigParameters.MessageServerHost, rdr.GetString(8));
                cp.Add(RfcConfigParameters.LogonGroup, rdr.GetString(11));
            } else {
                cp.Add(RfcConfigParameters.AppServerHost, rdr.GetString(0));
                cp.Add(RfcConfigParameters.SystemNumber, rdr.GetString(2));
            }
            cp.Add(RfcConfigParameters.Client, rdr.GetString(1));
            cp.Add(RfcConfigParameters.SystemID, rdr.GetString(9));
            cp.Add(RfcConfigParameters.User, rdr.GetString(4));
            cp.Add(RfcConfigParameters.Password, ppw);
            cp.Add(RfcConfigParameters.Codepage, "1100");
            cp.Add(RfcConfigParameters.Language, rdr.GetString(3));
            cp.Add(RfcConfigParameters.PeakConnectionsLimit, "101");
        }
        rdr.Close(); // also closes the connection (CommandBehavior.CloseConnection)
        return cp;
    }

    public bool ChangeEventsSupported() {
        return false; // can change any time asynchronously
    }

    public event RfcDestinationManager.ConfigurationChangeHandler ConfigurationChanged;
}



Case Study: Upgrade SAP NCo 2.0 to 3.0

Posted by Jitesh Kumar Sinha Sep 7, 2013

An application based on .NET Framework 1.1 using NCo 2.0 was to be upgraded to .NET
Framework 4.0 and NCo 3.0.

In this case study, only the upgrade to NCo 3.0 is covered.


To understand the constraints a little better, I will first briefly explain the
structure of the application to be upgraded.
The web service application comprises two parts (projects): one containing the
web services, and the other containing the old connector's SAP RFC proxy classes and
types. The proxy types are referenced from the web services, and removing all
the generated types and classes would have meant writing the entire application
from scratch.
There were hundreds of web services invoking approximately 120 SAP RFCs
to pull and push data from the SAP system.
There were dozens of consumer applications based on different technologies
consuming these services, and changing anything on the consumer application side
was not an option, considering the development and deployment effort it
would have required.
This posed another challenge, as the client proxies generated from the Framework
1.1 web services were not fully compatible with Framework 4.0 web services,
because the SOAP standard had changed.

Source Application Architecture

Target Application Architecture

Solution Design
After initial deliberation it quickly became clear that we needed to somehow automate the
process of generating proxy classes and types compatible with the new NCo 3.0. There were
approximately 120 RFCs, and writing code to populate and invoke the proxies would have taken
a considerable amount of time.
The output was a .NET-based utility which took all the existing classes and types and
converted them to types and methods compatible with the new NCo. At runtime it
connected to the SAP system to get the RFCs' metadata and generated the code required for
invoking the RFCs through the new connector.
The next issue was to map the .NET types to the types understood by the new connector,
namely RFCTable, RFCStructure and other basic types. A couple of generic conversion
routines were written to convert .NET tables and structures to the connector's RFCTable
and RFCStructure types. These routines converted the types at runtime, when the RFCs
were invoked.
Similarly, we required another set of generic routines to convert the export parameters from
RFCTable and RFCStructure back to .NET data types.
And finally, to overcome the incompatibility in the SOAP definitions, we had to rewrite
the incoming SOAP request on the web server side.
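By way of illustration only — this is not the project's actual utility code, just a sketch of what such a generic table-conversion routine could look like against the NCo 3.0 API, assuming the RFC table's fields carry the same names as the DataTable's columns:

```csharp
// Sketch: copy a .NET DataTable into an NCo 3.0 IRfcTable.
// Assumes the RFC table structure has fields matching the column names.
static void FillRfcTable(DataTable source, IRfcTable target)
{
    foreach (DataRow row in source.Rows)
    {
        target.Append(); // adds a new row and makes it the current one
        foreach (DataColumn col in source.Columns)
        {
            // SetValue on an IRfcTable addresses the current row
            target.SetValue(col.ColumnName, row[col]);
        }
    }
}
```

A mirror-image routine reading the current row of each returned IRfcTable/IRfcStructure back into .NET types would cover the export-parameter direction described above.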

Given the implementation constraints, the performance improvement achieved was only
slight. However, in terms of the technical upgrade it was a significant achievement: no
change in the client application and no consumer application roll-out were required. The only
downtime needed was for the server deployment.
There is scope for further improvement if more Framework 4.0 functionality is utilized and
the conversion routines are removed.


There is an excellent blog on NCo 3.0 by Thomas Weiss, A Spotlight on the New
.NET Connector 3.0, which I found very helpful.

Latest available NCo patch level: 3.0.13

Posted by Markus Tolksdorf May 8, 2013
This blog is intended to inform about the latest available patch level of NCo. When new ones
are shipped, this blog post will also be updated.

Patch level 3.0.13 of the SAP .NET Connector 3.0 has just been released. Please
see note 1921800 for a list of bug fixes and enhancements made with this patch level.
The SAP .NET Connector 3.0.13 can be downloaded from the SAP Service
Marketplace at
-> SAP Connector for Microsoft .NET
-> Download SAP Connector for Microsoft .NET Version 3.0
If you lack the required authorization for downloading software from the
SAP Service Marketplace, please follow the instructions in note 1037574 for
requesting this authorization.


Is it time to rethink the boundaries of your business and IT?
Posted by Clinton Jones Feb 19, 2013

There is increasing interest in facilitating the provision of enterprise data on mobile
devices, and companies with a BYOD policy are even more energized around understanding
how enterprise data can be safely delivered and provisioned on devices like smartphones
and tablets.

Mobile devices are of course nothing new; they are among the earliest technologies
that connected to back-end SAP and other corporate systems. In the past, however, the
whole provisioning activity was very limited, tightly controlled and controllable. Today,
pretty much anyone with the requisite URL, user name and password can potentially access
almost any piece of centrally stored data from anywhere in the connected world.

22% use SharePoint in a limited way

Osterman Research conducted a survey in early May 2012, polling 986 respondents
in a combination of small, mid-sized and large organizations. 79% of organizations
have personal smartphones connecting in some way to corporate resources. 71% have
personal tablet devices connecting. Only 48% have a formal policy in place. Many
cloud-based applications and storage repositories in particular are used without IT's
blessing. These present a particular data vulnerability, since as much as 30% of
mobile data is being synched using third-party products. 68% of the respondents are using
SharePoint, and 22% are using it on a departmental-only basis. The most common uses of
SharePoint were collaboration (87%) and forms and workflow (65%). Only 12%-13% use
SharePoint via a smartphone or a tablet. This all suggests a relatively low focus on
SharePoint as an obvious target for BYOD support, and yet it is an area in which Microsoft is
making a major development effort in SharePoint 2013.
Although this trend is of concern to businesses wary of unauthorized access to sensitive
information, there are some interesting options from Winshuttle that enable the provision
of enterprise data to mobile devices, remote users and extranet partners without providing
all access to all areas. Furthermore, this idea of unchained enterprise data does not
necessarily mean that you have to consider cloud technology if your business is not yet
ready to make the switch to the cloud.
In the world of apps it is relatively easy to expose very cleanly defined data models in highly
constrained combinations, because an app performs a very specific function. This makes
complete sense if your role or job requires you to perform only some very rudimentary
tasks, but how does this translate when you have a number of different things that you
want to do, from making a data inquiry, to changing a piece of data on the fly and then
updating the back-end system accordingly? Interestingly, in the Osterman survey 63% of
respondents said that they would prefer mobile users to access SharePoint using a client
application instead of a browser.
Some of the leading customer relationship solutions in the market support both an
application-based and a browser-based way for you to perform such tasks, but in many
respects simply having mobile-device-enabled websites is considered an old-fashioned
approach, and there is an increasing drive for users to specify the total technology that they
would prefer to use to get to and manipulate certain data. Websites, even the mobile-device-optimized
ones, suffer from what Jakob Nielsen, a usability guru, refers to as read-tap asymmetry:
you may be able to read the data, but navigation is difficult.

SharePoint with unimaginative use cases

In Osterman's research, the most common mobile-related SharePoint activities were found
to be:
Access documents and files while offline (85%)
Email links instead of attachments (64%)
Editing documents/syncing changes to SharePoint (62%)
Creating calendar items (59%)
Accessing SAP data via SharePoint, as facilitated by solution platforms
like Winshuttle or Microsoft Duet, is for example a common requirement enterprise-wide, but
not necessarily from a mobile device per se. One of the reasons for this may be the level
of effort, real or perceived, around creating connectivity between mobile devices and the
SAP back end.
The Bring Your Own Technology (BYOT) permissive approach by IT departments supports
those who prefer to use MacBooks or Windows devices as alternatives to corporate
technology standards, as well as those who prefer tablets or smartphones as their primary
device for working with corporate systems. Supporting all of these possible combinations
effectively with a unified platform can be technologically challenging for even the most
energetic of IT departments and IT infrastructure groups.

It could be argued that the rise of BYO has come about due to the failure of corporate IT to
meet the demands of the user community. Often the choice to go with one's own device is
driven by frustration with having to labour under the draconian IT policies imposed on
corporate-issued technology. A lack of administrator rights to install software, disabled USB
ports, and a geriatric operating system are all signs that IT is applying best practice in
controlling technology, but at the same time missing the point that some users actually
function best when their technology is unchained. There are numerous examples of
corporate IT policies overshadowing corporate user efficiency and effectiveness by blocking
or disabling certain functions and user practices, but these are not worth restating because
they are pretty well understood.

Compliance bothers 50% of technologists

The TEC 2012 BYOD, Big Data, Compliance & Migrations survey revealed an interesting
insight from its 119 participants. The major compliance issues were ranked as:

Appropriate access rights (50%)

Appropriate staffing for compliance handling (44%)

Managing compliance of devices (41%)

The mainframe mindset around desktop computing has to be undone and it has to be
accepted that many employees are often far more technology literate than the IT
department is prepared to give them credit for.
Avoiding the tide of BYO becoming an unsupportable tsunami needs some different
thinking, and SharePoint may prove to be a saviour in this space. Having a policy for BYO is
inevitable, and many of the technology providers that help IT organizations lock down
desktops now offer user-centric management approaches for users' own devices. Any BYO
program has a tough tightrope to walk between being practically effective and
reasonably supportable.
Some suggested things to consider around your policy:

Not all employees should be considered equal in terms of a BYO policy - how will
you decide?

What technologies will you really support?

What can they access?

Once you have determined your policies and tools consider the enabling technologies you
will use to distribute corporate data in a secure and reliable way. SharePoint 2013 may
pave the way for even more exciting and resilient ways for your user community to interact
from their own devices.

Additional Reading

TEC 2012 BYOD , Big Data, Compliance & Migrations survey

Osterman Research - Putting IT Back in Control of BYOD


NCo 3: Fun with BAPIs, part 1

Posted by Ed Hammerbeck Feb 14, 2013
In my previous post, I discussed ways of using the .NET connector (NCo 3) to connect your
client application with your R/3 back end. Let's assume you've done that. Now what?

NCo 3 is designed to enable you to work with BAPIs. SAP delivers hundreds of these fancy
function modules for us to use. They are handy for everything from getting a list of
employee benefit elections out of HCM to creating a cash journal document in FI. In fact, by
calling various BAPIs you could whip together powerful applications in your .NET
environment, leveraging SAP's own code for the hard stuff. Myself, I've been thinking about
a custom open enrollment application that leverages the available benefits-related BAPIs in HCM.
But let's discuss how you call a BAPI using NCo 3. Once you have a destination established,
all BAPIs and RFC-enabled function modules in that SAP system become available to you.
Just call it by name, and you get an instance of the BAPI in your .NET client app with all the
import, export, changing, and tables parameters -- all appropriately typed, all ready to use.
Here's how you do it. First, you need a variable to hold the BAPI.
Dim getBalances As IRfcFunction   ' BAPI = function module + fancy stuff
Now, you call the CreateFunction method of the repository object, passing it the name of
the BAPI you want. The repository object is a sub-object of the RfcDestination object, and
you know all about RfcDestination objects from my previous post.
getBalances = myDest.Repository.CreateFunction("BAPI_GL_GETGLACCPERIODBALANCES")
Now, I know from studying this BAPI in SE37 that it has some simple field parameters, a
structure (the RETURN parameter, where success and error messages come from), and a
table parameter. This particular BAPI works by passing it a GL account, fiscal year, and
some other simple parameters, and in return, you get a table of account balances broken
out by period as well as the fiscal year opening balance. So, I need to set some input
parameters before I can do anything. Let's do it this way.
With getBalances
    .SetValue("COMPANYCODE", "1234")
    .SetValue("CURRENCYTYPE", "10")
    .SetValue("GLACCT", "00000" & strGLAccount)
    .SetValue("FISCALYEAR", strFiscalYear)
End With
So, with the SetValue method, we identify which parameter we are setting and give it a
value. Think of it like a key-value pair where the key is the SAP name of the parameter. One
thing to remember about some of these parameters is that within SAP, conversion
programs often run automatically that, for instance, pad some fields with leading zeroes. In the
case of the GL account, SAP stores that 10-character field with leading zeroes, so I
have to make sure the value I pass is padded out to a length of 10 characters before I pass it
to my BAPI. (I admit, my example isn't very elegant.) If you find the BAPI doesn't like what
you are passing to it, make sure something like this isn't going on, and that you are passing
what the BAPI expects.
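If you don't want to count zeroes by hand, the .NET String.PadLeft method is a tidier way to build the 10-character internal format. A small sketch, reusing the variable names from the snippet above:

```vbnet
' Pad the GL account to SAP's internal 10-character, zero-filled format.
' PadLeft only adds zeroes when the value is shorter than 10 characters,
' so an already-padded account number passes through unchanged.
With getBalances
    .SetValue("GLACCT", strGLAccount.PadLeft(10, "0"c))
End With
```

This way the same line works whether the user types "113100" or the full "0000113100".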
Once we pass in our input parameters, we call the RfcFunction's Invoke method. The Invoke
method takes your destination object as its parameter. Behind the scenes, this is where
NCo 3 logs into SAP, executes your BAPI with the parameters you provided, and returns
whatever it is supposed to. How do you retrieve this output? The RfcFunction object comes
with several Get____ methods, including GetTable, GetStructure, GetString, etc.
Dim tblAccountBalances As IRfcTable
Dim strucReturn As IRfcStructure
getBalances.Invoke(myDest) ' <---- where the magic happens
tblAccountBalances = getBalances.GetTable("ACCOUNT_BALANCES")
strBalanceCarryForward = getBalances.GetString("BALANCE_CARRIED_FORWARD")
strucReturn = getBalances.GetStructure("RETURN")
In the code above, after invoking the BAPI, I simply get the output by calling their SAP
name. Grabbing values from simple fields is easy with methods like GetString or
GetDecimal. GetTable returns an IRfcTable type. The IRfcTable type is fairly intuitive to use
and behaves a bit like a .NET DataTable, and the IRfcStructure sort of behaves like a one-row
table. That's a gross oversimplification, however. Both objects have their peculiarities,
and in my next post, I will discuss some of the things I learned about using them.
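For a quick taste of working with these objects, here is a sketch of reading the results. The RETURN fields TYPE and MESSAGE follow the standard BAPI convention, but the exact column names in the balances table are assumptions you should verify in SE37:

```vbnet
' Most BAPIs report problems in the RETURN structure rather than raising exceptions:
' TYPE "E" (error) or "A" (abort) means the call did not succeed.
If strucReturn.GetString("TYPE") = "E" OrElse strucReturn.GetString("TYPE") = "A" Then
    Console.WriteLine("BAPI error: " & strucReturn.GetString("MESSAGE"))
Else
    ' IRfcTable is enumerable; each row behaves like an IRfcStructure.
    For Each row As IRfcStructure In tblAccountBalances
        Console.WriteLine(row.GetString("FISC_PERIOD") & ": " & row.GetString("BALANCE"))
    Next
End If
```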

Connecting to SAP with NCo 3

Posted by Ed Hammerbeck Feb 14, 2013
The SAP .NET Connector version 3 (NCo 3) is a vast improvement over previous versions.
Within a couple hours of downloading it, I had whipped together a nifty, little prototype
client application consuming BAPIs from several different application areas.
In the next few blog posts, I will discuss some of the tricks I discovered while messing
around with this powerful tool. If you are just getting started with this tool, I hope these
posts will help you over some features that weren't so well documented. One of the major
improvements in version 3 simplified the act of connecting to the R/3 back end, and that
will be the subject of this post.
First off, some advice. SAP was kind enough to make a tutorial application available for
download. Download it! Try to run it. Debug it. Study it closely. It gives working code
examples of most of the major features of NCo 3. I found it a big help when
the documentation and programming guide didn't exactly make sense. There's nothing like
studying code examples. I will refer to the tutorial application from time to time in these posts.
The simplest, if least desirable, method of configuring your RfcDestination object is to hard
code your logon parameters using the RfcConfigParameters object. Thomas Weiss provides
a good example of this in his excellent blog. You can also put these parameters in the
app.config file. This is well demonstrated in the tutorial application. A method I figured out
through trial-and-error, since the documentation on this didn't make much sense to me,
was to leverage your SAP GUI's saplogon.ini file. This, of course, assumes your client
application is going to be running on a machine with the SAP GUI installed.
First, you must create an instance of the SapLogonIniConfiguration object and register it as
your destination configuration. As if by magic, this is all it takes for the tool to find your
saplogon.ini file. These two simple lines of code took me quite a while to figure out.
Dim mySapLogon As SapLogonIniConfiguration = SapLogonIniConfiguration.Create

Once you do this, you can get the logon parameters for the SAP instance you
want to use by specifying its name. In this case, I want to log into our sandbox system, ECCSBX.
Dim myparm As RfcConfigParameters = mySapLogon.GetParameters("ECCSBX")
If you run your code, and stop it just after this line, you'll see that myparm is filled with
parameter data. All that came from your saplogon.ini. The next thing you need to do is set
your desired SAP instance as an RfcDestination, and then from that, you must create a
custom destination. I'm not sure why you can't do this in one step, but you can't.
Dim newDest As RfcDestination = RfcDestinationManager.GetDestination(myparm)
Dim newCustomDest As RfcCustomDestination = newDest.CreateCustomDestination()
Finally, you must pass in the missing parameters, such as user name, password, and client,
to the custom destination. You could get this from the user in a dialog box, pull it from a
database, from the app.config file, or hard code it as in my example below. It's up to you.
newCustomDest.User = "batman"        ' don't hard code this stuff
newCustomDest.Password = "jokersux"  ' ... seriously, don't
newCustomDest.Client = "110"         ' client could be a selection the user makes
Now, your destination object is fully configured from saplogon.ini information in eight easy
lines of code. Now, you are ready to take this destination object, start grabbing BAPIs from
the repository, and do amazing things with SAP data. I will discuss how I did that in my next post.
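Putting the pieces together, the whole saplogon.ini-based sequence reads something like this. The system name, user, password, and client are placeholders for your own values, and the final Ping call is an optional extra that verifies connectivity and logon data before you start calling BAPIs:

```vbnet
' Build a destination from saplogon.ini and complete it with logon data.
' "ECCSBX" must match an entry name in your saplogon.ini.
Dim mySapLogon As SapLogonIniConfiguration = SapLogonIniConfiguration.Create()
Dim myparm As RfcConfigParameters = mySapLogon.GetParameters("ECCSBX")
Dim newDest As RfcDestination = RfcDestinationManager.GetDestination(myparm)
Dim newCustomDest As RfcCustomDestination = newDest.CreateCustomDestination()
newCustomDest.User = "batman"        ' better: prompt the user for these
newCustomDest.Password = "jokersux"
newCustomDest.Client = "110"
newCustomDest.Ping()                 ' optional: fail fast if the logon data is wrong
```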

Binary incompatibility in NCo 3.0.9

Posted by Markus Tolksdorf Nov 9, 2012
Unfortunately, with NCo 3.0.9, a binary incompatibility has been introduced. See also SAP
note 1782675. Therefore, make sure not to use 3.0.9 as the basis for your projects, as later
patch levels will break your application if you don't have the possibility to recompile.
Instead, download the latest .NET Connector patch level from
https://service.sap.com/connectors -> SAP Connector for Microsoft .NET -> Download SAP
Connector for Microsoft .NET Version 3.0 (currently 3.0.10).

Should you really worry about what's in the secret sauce?
Posted by Clinton Jones May 10, 2012
Perhaps it speaks more to my personality than anything else, but I hate being asked, how
do you do that, or what is in there? The worst are the product label scrutineers! I am
talking about culinary masterpieces in the kitchen, of course, but then you knew that, didn't
you? I know my saucepans from my frying pans and I know the temperature at which to
bake a sponge cake, but I am afraid that when you move from baking to cooking you're in
the world of artists rather than the world of chemistry. Some chefs and master bakers
might disagree.
So too, we come to the world of providing technology to the business. Sometimes we
shouldn't have anyone worry about the ingredients that went into the secret sauce of the
delivered solution. Maybe the fact that it is built out of .NET components is important, but
is it overriding? After all, it doesn't seem so long ago that we frowned upon the use of
personal devices on the corporate network, like that Palm OS device, while everyone else
was lugging around mobile phones the size of packets of cream cheese. OK, it wasn't quite
that bad, but you get my point. I am talking here about the way that you're getting that
data in and out of your SAP system.
There are so many different ways of accessing your SAP system today. You'd be pretty hard
pressed to find any single way to do everything, and although the promise of simplification
in architecture is out there, it will take some time before this is fully established. In the
meantime, business goes on, orders need to be placed and fulfilled, and the wheels of
commerce need to keep grinding on. There are probably methods and mechanisms that
are in daily use, and have been established for some time, that some might raise their
eyebrows at or frown upon. The point would be, though: why get all bent out of shape over
the specifics of the technology if it is addressing your business need consistently and
reliably without compromising your system security? If you're discovering flaws in the way
your systems have been architected, then instead of raising alarm bells without any clear
strategy for remediation, rather spend that energy on working on the next generation of
solutions that will address the business requirements.
I've noticed a trend, particularly with rambunctious enterprise architects who've suddenly
been included in design discussions about the enterprise stack, of making comments like,
"we have made it a policy not to use technology xyz" or worse, saying things like "what if
SAP changes directions and kills off ...". These types of posturing and melodramatic
postulating about future technologies are important. They're important when your business
precariously teeters on the brink of making a decision to completely change its business
model based on technology (moving from bricks and mortar to online selling), decides to
renew its manufacturing infrastructure (switching from a custom-built M.E.S. to an off-the-shelf
product like WonderWare), or modifies the staffing model around a particular
technology stack (offshoring call centers because VoIP is supported), but it rarely plays
such a major role when you're talking about bolting improved functionality or efficiency
engines onto your existing ERP system. Such technologies, like Business Objects for
analytics, Crystal Reports for report writing, Syclo for mobile or Winshuttle for process
automation and UI replacement, are all disparate technologies. They have similarities and
major differences. They all rely on SAP as the system of record and they all bring
substantial business benefits for resolving very specific business problems today.
We need to get out of the mode of thinking about technology built to last with an indefinite
life and start thinking about technology that solves the problems that we have today,
especially when you're dealing with .NET technology. Ten years ago there was scarcely
such a thing as a smartphone, SharePoint was still an emerging technology, and yet today
both are pretty ubiquitous. In another ten years there will be something different. A given
SAP installation's dependency on an Oracle database today may suddenly be supplanted
by a HANA RDB, and then your enterprise commitment today to the big O may appear to
have been flawed.

2 approaches to architecting SAP / SharePoint

Posted by William van Strien Nov 4, 2010
In today's market, there is growing interest in integrating the structural business processing of
SAP within the familiar workplace environment of the information worker. In more and more
companies, these enterprise workplaces are provided by means of SharePoint 2007/2010.
Achieving SAP / SharePoint integration in a structural and future-proof manner requires
up-front investigation to come up with a solid interoperability architecture.

Main steps to define a solid SAP / Microsoft .NET interoperability architecture

Derive and define guiding principles; first from the business
perspective, and next from IT. Typically the latter are more constrictive in nature; e.g.
required to apply a service architecture, required to conform to W3C standards, ...

Analyze the current state (IST) of the IT landscapes within the company: the SAP and
Microsoft environments and server products.


Define and describe the interoperability architecture

Conceptual level; on purpose technology and product agnostic, to make it
more timeless and future-proof
Concrete level; with a choice for interoperability technologies and products
that are nowadays available
Validate the interoperability architecture by means of either a Proof-of-Concept, or
via a small launching interoperability project.

Adjust the defined interoperability architecture on the lessons learned.

Guiding architecture principles

Layered architecture, with separation of concerns. A typical layer architecture is:





Service architecture

Loosely coupled

SAP business backend is and remains responsible for the correctness of business data

Responsibility for business data consistency lies within the SAP backend layer

Approaches to derive the interoperability architecture for a concrete case
When in context of delivering a concrete application, there are basically 2 approaches you
can apply to derive the layered interoperability architecture.

Inside-Out

As the name already suggests, the starting point here is your current SAP landscape state.
Start with identifying the available functional building blocks in the SAP environment:
existing BAPI Function Modules, RFCs, SAP workflow business objects, and already
available SAP web services. Then expose these to the outside world, to have the related SAP
data and functionality consumed by a non-SAP front-end.
The biggest advantage of this approach is a faster time-to-market. You can base
your SAP / Microsoft .NET interoperability on already existing, and thus also tested, SAP
building blocks. The only thing that needs to be done is to put a (web) service interface on
them, and then you can integrate with the Microsoft-based presentation layer.
The biggest disadvantage can be summarized with the phrase Garbage In, Garbage Out. If you base your
interoperability architecture on the current state of your SAP environment, there is a large
likelihood that SAP-proprietary concepts will be visible at the integration surface level. In
general, an architecture that originates via this approach will be less pure and transparent,
and thus less future-proof.

Outside-In (nb: in Microsoft terminology, this approach is also referred to with the
phrase Contract-First)

The essence of the Outside-In approach is to first agree on and define the conceptual
contract [interface] between the service provider side [SAP] and the service consumer side
[SharePoint]. The idea is to start from the requested application functionalities. Derive and
define, at a conceptual level, the services you require from the SAP backend to deliver the
application and system functionalities. Describe the service interfaces in W3C-standards-based
notation and data structures. From here on, map them onto the required SAP building blocks:
existing ones if available, new ones otherwise. At the SharePoint / presentation side, you can build
the consumption layer for the defined service interfaces via WSDL.
Thus you start at the outside with the conceptual, externally visible service interfaces, and
then continue, for both the provider and consumer / SAP and Microsoft sides, to the inside with
their respective technologies.
Because you start in this approach with a green field, the resulting interoperability
architecture typically has a cleaner interface, which inherently conforms to interoperability
standards. And because you start from the required business services, it also has a better
chance of being conceptually correct, and future-proof.
The biggest disadvantage is that this approach requires more investment and time up front in
deriving and describing the service interface layer. It also requires getting representatives of
both the SAP and Microsoft departments on par, to reach a common understanding of the
applied service concepts. And in the case of existing SAP building blocks, a transformation can
be required to map the standards-based service interface to the SAP-specific Function
Modules, RFCs, and workflow business objects.
