
Directory: /usr/sap/SMP/DVEBMGS00/work

Name: dev_disp
---------------------------------------------------------------------------------------------------

---------------------------------------------------
trc file: "dev_disp.new", trc level: 1, release: "749"
---------------------------------------------------
sysno 00
sid SMP
systemid 390 (AMD/Intel x86_64 with Linux)
relno 7490
patchlevel 0
patchno 700
intno 20160201
make multithreaded, Unicode, 64 bit, optimized
profile /usr/sap/SMP/SYS/profile/SMP_DVEBMGS00_smprd02
pid 32089

Sat Sep 14 06:39:35:319 2019


kernel runs with dp version 24000(ext=119000) (@(#) DPLIB-INT-VERSION-0+24000-UC)
length of sys_adm_ext is 504 bytes
***LOG Q00=> DpSapEnvInit, DPStart (00 32089) [dpInit.c 683]
*** WARNING => DpCheckParams: invalid value for SETENV_04 = PATH=/usr/sap/SMP/DVEBMGS00/exe:/usr/sap/SMP/DVEBMGS00/exe:/sapdb/clients/SDB/bin:/sapdb/clients/SDB/pgm:/sapdb/programs/pgm:/sapdb/programs/bin#:/usr/sap/SMP/hdbclient:/sybase/SMP/ASE-16_0/jobscheduler/bin:/sybase/SMP/ASE-16_0/bin:/sybase/SMP/ASE-16_0/install:/sybase/SMP/OCS-16_0/bin:/sapdb/clients/SDB/bin:/sapdb/clients/SDB/pgm:/sapdb/programs/pgm:/sapdb/programs/bin#:/usr/sap/SMP/hdbclient:/home/smpadm/bin:/usr/local/bin:/bin:/usr/bin:/usr/bin/X
*** WARNING => DpCheckParams: check message: Parameter value not matching with regular expression '.+=.*' [dpInit.c 4068]
DwSLCheckVersion: shared lib "dw_xml.so" version 700, compatibility level 700, SAP release 7490 successfully loaded
DwSLCheckVersion: shared lib "dw_xtc.so" version 700, compatibility level 700, SAP release 7490 successfully loaded
DwSLCheckVersion: shared lib "dw_stl.so" version 700, compatibility level 700, SAP release 7490 successfully loaded
DwSLCheckVersion: shared lib "dw_gui.so" version 700, compatibility level 700, SAP release 7490 successfully loaded
DwSLCheckVersion: shared lib "dw_rndrt.so" version 700, compatibility level 700, SAP release 7490 successfully loaded
DwSLCheckVersion: shared lib "dw_abp.so" version 700, compatibility level 700, SAP release 7490 successfully loaded
rdisp/softcancel_sequence : -> 0,5,-5
use internal message server connection to port 3901
rdisp/shutdown/disable_login : 0
rdisp/show_dispatcher_info : 1
DpCommonParamInit: rdisp/core_file_size = default --> no change

Sat Sep 14 06:39:35:425 2019


MtxInit: DISP 0 0
DpIPCInit2: write dp-profile-values into sys_adm_ext
RqQInit: use events to trigger worker
DpIPCInit2: start server >smprd02_SMP_00 <
DpShMCreate: allocate/attach shared memory (mode=CREATE)
DpShMCreate: sizeof(wp_adm) 23352 (1112)
DpShMCreate: sizeof(tm_adm) 53213160 (RDISPTERM=53160,MODEINFO=3040,IMODE_INFO=80)
DpShMCreate: sizeof(ca_adm) 432000 (72)
DpCommTableSize: max/headSize/ftSize/tableSize=1500/8/5688056/5979112
DpShMCreate: sizeof(comm_adm) 5979112 (3784)
DpSlockTableSize: max/headSize/ftSize/fiSize/tableSize=0/0/0/0/0
DpShMCreate: sizeof(slock_adm) 0 (296)
DpFileTableSize: max/headSize/ftSize/tableSize=0/0/0/0
DpShMCreate: sizeof(file_adm) 0 (80)
DpSockTableSize: max/headSize/ftSize/tableSize=1500/8/1536056/1536064
DpShMCreate: sizeof(sock_adm) 1536064 (1016)
DpShMCreate: sizeof(vmc_adm) 0 (3224)
DpShMCreate: sizeof(wall_adm) (ft=200056/fi=205048/hd=64/rec=192)
DpShMCreate: sizeof(amc_rec_adm) (ft=1360056/fi=377048/hd=64/rec=672)
DpShMCreate: sizeof(comp_msg_adm) (ft=88056/fi=119048/hd=96/rec=168)
DpShMCreate: sizeof(websocket_adm) (ft=328056/hd=80/rec=648)
DpShMCreate: sizeof(gw_adm) 56
DpShMCreate: sizeof(j2ee_adm) 3928
DpShMCreate: SHM_DP_ADM_KEY (addr: 7f0a49a65000, size: 63972096)
DpShMCreate: allocated sys_adm at 7f0a49a65200
DpShMCreate: allocated wp_adm_list at 7f0a49a77280
DpShMCreate: allocated wp_adm at 7f0a49a775a0
DpShMCreate: allocated tm_adm_list at 7f0a49a7d2d8
DpShMCreate: allocated tm_adm at 7f0a49a7d558
DpShMCreate: allocated ca_adm at 7f0a4cd3cf40
DpShMCreate: allocated comm_adm at 7f0a4cda68c0
DpShMCreate: system runs without slock table
DpShMCreate: allocated sock_adm at 7f0a4d35a6a8
DpShMCreate: allocated vmc_adm_list at 7f0a4d4d18e8
DpShMCreate: system runs without VMC
DpShMCreate: allocated gw_adm at 7f0a4d4d1b88
DpShMCreate: allocated j2ee_adm at 7f0a4d4d1dc0
DpShMCreate: allocated ca_info at 7f0a4d4d2f18
DpShMCreate: allocated wall_adm (ft) at 7f0a4d4d3158
DpShMCreate: allocated wall_adm (fi) at 7f0a4d5040d0

Sat Sep 14 06:39:35:441 2019


DpShMCreate: allocated wall_adm (head) at 7f0a4d5363c8
DpShMCreate: allocated amc_rec_adm (ft) at 7f0a4d536608
DpShMCreate: allocated amc_rec_adm (fi) at 7f0a4d6828c0
DpShMCreate: allocated amc_rec_adm (head) at 7f0a4d6deb98
DpShMCreate: allocated comp_msg_adm (ft) at 7f0a4d6dedd8
DpShMCreate: allocated comp_msg_adm (fi) at 7f0a4d6f47d0
DpShMCreate: allocated comp_msg_adm (head) at 7f0a4d711ad8
DpShMCreate: allocated websocket_adm (ft) at 7f0a4d711d38
DpShMCreate: allocated websocket_adm (head) at 7f0a4d7620b0
DpShMCreate: initialized 24 eyes
DpSysAdmIntInit: initialize sys_adm
DpSysAdmIntInit: created queue 0 (DispatcherQueue)
DpSysAdmIntInit: created queue 1 (GatewayQueue)
DpSysAdmIntInit: created queue 2 (IcmanQueue)
DpSysAdmIntInit: created queue 3 (StartServiceQueue)
DpSysAdmIntInit: created queue 4 (DpMonQueue)
Scheduler info
--------------
WP info
#dia = 8
#btc = 0
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite

Sat Sep 14 06:39:35:442 2019

maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

DpCommAttachTable: attached comm table (header=7f0a4cda68c0/ft=7f0a4cda68c8/fi=7f0a4d3133c0)
DpSockAttachTable: attached sock table (header=7f0a4d35a6a8/ft=7f0a4d35a6b0)
MBUF state OFF
DpCommInitTable: init table for 1500 entries
DpSockInitTable: init table for 1500 entries
EM: Initializing PROC global storage: quota=0 use_stdheap=0
EM: Initializing PROC local storage: quota=0 use_stdheap=0
EmInit: MmSetImplementation( 2 ).
MM global diagnostic options set: 0
EsSetImplementation: Using implementation std
<ES> Info: use mapped file
<ES> Info: use normal pages (no huge table support available)
EmSetReserve: set EM reserve to 112 EM blocks
<ES> client 21 initializing ....
EsSetImplementation: Using implementation std
<ES> EsILock: use Mutex for locking
<ES> InitFreeList
<ES> Info: disclaim threshold = 0 MB
<ES> Info: disclaim coasting/alloc = 300 s
<ES> Info: disclaim coasting/free = 0 s
<ES> Info: blockdisclaimsize_KB = 0 KB
<ES> Info: TimeMeasurementActive = 0
EsStdInit: 4096 MB successfully allocated, 18428 MB left
EsStdInit: 4096 MB successfully allocated, 14332 MB left
EsStdInit: 4096 MB successfully allocated, 10236 MB left
EsStdInit: 4096 MB successfully allocated, 6140 MB left
EsStdInit: 4096 MB successfully allocated, 2044 MB left
EsStdInit: 2044 MB successfully allocated, 0 MB left
EsStdInit: Extended Memory 22524 MB allocated
Linux: Kernel supports shared memory disclaiming
Linux: using madvise(<pointer>, <size>, 9).
<ES> 5630 blocks reserved for free list.
ES initialized.
EgInit: EG initialized with 4120M in 2 segments
mm.dump: set global maximum dump mem to 192 MB

Sat Sep 14 06:39:35:504 2019


EsRegisterEmCheck: Register EmGetEsHandles at 1a9f98d
rdisp/calculateLoadAverage : 1
rdisp/snapshot(CREATE): DP_CS_RESOURCES_EXHAUSTED
rdisp/snapshot(PERIOD):300 sec
rdisp/snapshot(LINES):10000 lines
rdisp/snapshot(C-STACK):C-STACK = YES
=================================================
= SSL Initialization platform tag=(linuxx86_64_gcc43)
= (749_STACK patchno 700,Apr 10 2019,mt,ascii-uc, 16/64/64)
= Initialization with _no_ default credentials
= resulting Filename = "/usr/sap/SMP/DVEBMGS00/exe/libsapcrypto.so"
= disabled FIPS 140-2 crypto kernel
= found CommonCryptoLib 8.5.26 (Feb 22 2019) [AES-NI,CLMUL,SSE3,SSSE3]
= current UserID: "smpadm", env-var USER="smpadm"
= found SECUDIR environment variable
= using SECUDIR=/usr/sap/SMP/DVEBMGS00/sec
= [ctc] ssl/ciphersuites=HIGH:MEDIUM:+e3DES:!aNULL
= [dpf] ssl/client_ciphersuites=918:PFS:HIGH::EC_P256:EC_HIGH
= creating Envvar SAPSSL_CLIENT_CIPHERSUITES=918:PFS:HIGH::EC_P256:EC_HIGH
= Success -- SapCryptoLib SSL ready!
=================================================

ssfPkiInitSAPCryptolib:
SsfSupInitEx("/usr/sap/SMP/DVEBMGS00/exe/libsapcrypto.so")==0 (SSF_SUP_OK)
found CommonCryptoLib 8.5.26 (Feb 22 2019) [AES-NI,CLMUL,SSE3,SSSE3]
secure communication to message server is switched on
***LOG Q0K=> DpMsAttach, mscon ( smprd02) [dpMessageSer 1902]
DpStartStopMsg: send start message (myname is >smprd02_SMP_00 <)
DpStartStopMsg: start msg sent to message server o.k.
MBUF state LOADING
DpCheckStopStateAfterMsAttach: server supports AMC msg header version 2
MPI: dynamic quotas disabled.
MPI init, created: pipes=4000 buffers=2718 reserved=815 quota=10%, buffer size=65536, total size MB=170
DpAsCreate: forked gwrd (pid 32117)
DpIcmCreate: forked ICM (pid 32118)
DpWpDynCreate: created new work process W0-32119
DpWpDynCreate: created new work process W1-32120
DpWpDynCreate: created new work process W2-32121
DpWpDynCreate: created new work process W3-32122
DpWpDynCreate: created new work process W4-32123
DpWpDynCreate: created new work process W5-32124

Sat Sep 14 06:39:35:557 2019


DpWpDynCreate: created new work process W6-32125
DpWpDynCreate: created new work process W7-32126
DpWpDynCreate: created new work process W8-32127
DpWpDynCreate: created new work process W9-32128
DpWpDynCreate: created new work process W10-32129
DpWpDynCreate: created new work process W11-32130
DpWpDynCreate: created new work process W12-32131
DpWpDynCreate: created new work process W13-32132
DpWpDynCreate: created new work process W14-32133
DpWpDynCreate: created new work process W15-32134
**START Linux Memory Parameter Check
virtual memory: hard-limit = UNLIMITED
virtual memory: soft-limit = UNLIMITED
core size: hard-limit = UNLIMITED
core size: soft-limit = UNLIMITED
data segment size: hard-limit = UNLIMITED
data segment size: soft-limit = UNLIMITED
stack size: hard-limit = UNLIMITED
stack size: soft-limit = 8 MB
max open files: hard-limit = 65536
max open files: soft-limit = 65536
Page Size: 4 KB
kernel.shmmax = 8796093022207 MB
kernel.shmall = 4503599627370495 MB
**END Linux Memory Parameter Check
Linux scheduler "SCHED_OTHER" used with prio 0

Sat Sep 14 06:39:35:785 2019


CCMS: SemInMgt: Initializing Semaphore Management in AlAttachShm_Doublestack.
CCMS: SemInit: Semaphore 38 initialized by AlAttachShm_Doublestack.
CCMS: start to initalize 3.X shared alert area (first segment).
DpCheckPreemptionTicker: created thread for DpPremptionTicker
DpMBufHwIdSet: set Hardware-ID
***LOG Q1C=> DpMBufHwIdSet [dpxxmbuf.c 1215]
MBUF state ACTIVE
DpMBufTypeMod: MBUF component UP (event=MBUF_DPEVT_UP)
DpMsgProcess: 1 server in MBUF
DpMsCheckServices()
DpMsgProcess: MBUF state is MBUF_ACTIVE
DpMBufReset: state = MBUF_PREPARED

Sat Sep 14 06:39:41:480 2019


MBUF state ACTIVE
DpMBufTypeMod: MBUF component UP (event=MBUF_DPEVT_UP)
DpMsgProcess: 1 server in MBUF
DpMsCheckServices()

Sat Sep 14 06:39:41:711 2019


DpMBufTypeMod: MBUF server state ACTIVE (event=MBUF_DPEVT_DSTATE)
DpModState: change server state from STARTING to ACTIVE
DpMsInfo: MOD for server smprd02_SMP_00

Sat Sep 14 06:40:07:576 2019


DpSendLoadInfo: quota for load / queue fill level = 7.200000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 7.406501 / 0.300000

Sat Sep 14 06:40:15:568 2019


AdAdjustBufferForLongMessages: maximal number of AD records is 10000, scratch len = 1040036 bytes

Sat Sep 14 06:40:28:328 2019


DpWpDynCreate: created new work process W16-404

Sat Sep 14 06:40:28:590 2019

DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 06:40:31:342 2019


DpWpDynCreate: created new work process W17-432

Sat Sep 14 06:40:42:601 2019


DpSendLoadInfo: quota for load / queue fill level = 9.000000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 9.051612 / 0.307143

Sat Sep 14 06:40:48:496 2019


DpWpDynCreate: created new work process W18-572

Sat Sep 14 06:40:48:605 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 06:40:50:930 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 11->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5

Sat Sep 14 06:40:51:135 2019


DpSendLoadInfo: quota for load / queue fill level = 9.900000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 9.925944 / 0.250000

Sat Sep 14 06:40:51:893 2019


DpWpConf: wp reconfiguration, stop W4, pid 32123
DpAdaptWppriv_max_no : 3 -> 6
DpWpDynCreate: created new work process W19-589

Sat Sep 14 06:40:52:613 2019


DpSendLoadInfo: queue DIA no longer with high load
DpHdlDeadWp: W4 (pid=32123) terminated automatically

Sat Sep 14 06:40:53:614 2019


DpSendLoadInfo: quota for load / queue fill level = 9.900000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 10.045682 / 0.257143

Sat Sep 14 06:40:54:959 2019


DpWpDynCreate: created new work process W20-762

Sat Sep 14 06:40:55:255 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 06:41:00:609 2019


DpWpConf: wp reconfiguration, stop W17, pid 432
DpAdaptWppriv_max_no : 6 -> 7

Sat Sep 14 06:41:00:754 2019


DpHdlDeadWp: W17 (pid=432) terminated automatically

Sat Sep 14 06:41:01:619 2019


DpSendLoadInfo: quota for load / queue fill level = 9.900000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 10.329930 / 0.278571

Sat Sep 14 06:41:14:959 2019

DpWpConf: wp reconfiguration, stop W7, pid 32126
DpAdaptWppriv_max_no : 7 -> 6

Sat Sep 14 06:41:15:148 2019


DpHdlDeadWp: W7 (pid=32126) terminated automatically

Sat Sep 14 06:41:26:500 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 06:45:54:983 2019


DpHdlDeadWp: W19 (pid=589) terminated automatically

Sat Sep 14 06:45:55:579 2019


DpWpCheck: dyn W20, pid 762 no longer needed, terminate now

Sat Sep 14 06:45:56:057 2019


DpHdlDeadWp: W20 (pid=762) terminated automatically

Sat Sep 14 09:17:15:865 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T30_U4242 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T30_U4242_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |07:34:49|16 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T30_U4242_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Sat Sep 14 09:36:56:592 2019


DpHdlSoftCancel: cancel request for T23_U12799_M0 received from DISP (reason=DP_SOFTCANCEL_AD_MSG)

Sat Sep 14 09:37:49:583 2019


DpWpDynCreate: created new work process W4-30151

Sat Sep 14 09:38:37:503 2019


DpHdlDeadWp: W10 (pid=32129) terminated automatically
DpWpDynCreate: created new work process W10-30822

Sat Sep 14 09:42:55:906 2019


DpWpCheck: dyn W4, pid 30151 no longer needed, terminate now

Sat Sep 14 09:42:56:331 2019


DpHdlDeadWp: W4 (pid=30151) terminated automatically

Sat Sep 14 09:43:15:914 2019


DpWpDynCreate: created new work process W17-2751

Sat Sep 14 09:47:59:444 2019


DpWpDynCreate: created new work process W7-4947

Sat Sep 14 09:48:05:862 2019


DpWpDynCreate: created new work process W19-4958

Sat Sep 14 09:48:16:594 2019


DpHdlDeadWp: W17 (pid=2751) terminated automatically

Sat Sep 14 09:49:15:591 2019


DpWpDynCreate: created new work process W20-5636

Sat Sep 14 09:53:00:258 2019


DpHdlDeadWp: W7 (pid=4947) terminated automatically

Sat Sep 14 09:53:15:922 2019


DpWpCheck: dyn W19, pid 4958 no longer needed, terminate now

Sat Sep 14 09:53:16:932 2019


DpHdlDeadWp: W19 (pid=4958) terminated automatically

Sat Sep 14 09:54:19:341 2019


DpHdlDeadWp: W20 (pid=5636) terminated automatically

Sat Sep 14 09:58:14:596 2019


DpWpDynCreate: created new work process W4-9111

Sat Sep 14 10:03:15:935 2019


DpWpCheck: dyn W4, pid 9111 no longer needed, terminate now

Sat Sep 14 10:03:16:885 2019


DpHdlDeadWp: W4 (pid=9111) terminated automatically

Sat Sep 14 10:08:26:657 2019


DpHdlDeadWp: W13 (pid=32132) terminated automatically
DpWpDynCreate: created new work process W13-5982

Sat Sep 14 10:08:35:427 2019


DpWpDynCreate: created new work process W17-6623

Sat Sep 14 10:09:37:596 2019


DpWpDynCreate: created new work process W7-9688
DpWpDynCreate: created new work process W19-9689

Sat Sep 14 10:13:55:959 2019


DpWpCheck: dyn W17, pid 6623 no longer needed, terminate now

Sat Sep 14 10:13:56:903 2019


DpHdlDeadWp: W17 (pid=6623) terminated automatically

Sat Sep 14 10:14:45:154 2019


DpHdlDeadWp: W7 (pid=9688) terminated automatically
DpWpCheck: dyn W19, pid 9689 no longer needed, terminate now

Sat Sep 14 10:14:46:258 2019


DpHdlDeadWp: W19 (pid=9689) terminated automatically

Sat Sep 14 10:16:01:774 2019


DpWpDynCreate: created new work process W20-11822

Sat Sep 14 10:21:15:973 2019


DpWpCheck: dyn W20, pid 11822 no longer needed, terminate now

Sat Sep 14 10:21:16:667 2019


DpHdlDeadWp: W20 (pid=11822) terminated automatically

Sat Sep 14 10:26:58:261 2019


DpWpDynCreate: created new work process W4-16059

Sat Sep 14 10:32:15:989 2019

DpWpCheck: dyn W4, pid 16059 no longer needed, terminate now

Sat Sep 14 10:32:16:485 2019


DpHdlDeadWp: W4 (pid=16059) terminated automatically

Sat Sep 14 10:37:19:381 2019


DpWpDynCreate: created new work process W17-19954

Sat Sep 14 10:42:36:010 2019


DpWpCheck: dyn W17, pid 19954 no longer needed, terminate now

Sat Sep 14 10:42:36:940 2019


DpHdlDeadWp: W17 (pid=19954) terminated automatically

Sat Sep 14 10:42:56:011 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T17_U20656 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T17_U20656_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |10:01:20|1 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T17_U20656_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Sat Sep 14 10:45:02:779 2019


DpWpDynCreate: created new work process W7-22557

Sat Sep 14 10:45:57:227 2019


DpWpDynCreate: created new work process W19-22983

Sat Sep 14 10:48:04:556 2019


DpWpDynCreate: created new work process W20-24761

Sat Sep 14 10:50:03:710 2019


DpHdlDeadWp: W7 (pid=22557) terminated automatically

Sat Sep 14 10:51:01:386 2019


DpHdlDeadWp: W19 (pid=22983) terminated automatically

Sat Sep 14 10:53:32:464 2019


DpHdlDeadWp: W20 (pid=24761) terminated automatically

Sat Sep 14 11:01:55:351 2019


DpHdlSoftCancel: cancel request for T14_U381_M0 received from DISP (reason=DP_SOFTCANCEL_AD_MSG)
DpHdlSoftCancel: cancel request for T11_U898_M0 received from GATEWAY (reason=DP_SOFTCANCEL_ABORT_PROGRAM)
DpHdlSoftCancel: cancel request for T2_U867_M0 received from GATEWAY (reason=DP_SOFTCANCEL_ABORT_PROGRAM)

Sat Sep 14 11:01:55:723 2019


DpHdlDeadWp: W12 (pid=32131) terminated automatically
DpWpDynCreate: created new work process W12-32601

Sat Sep 14 11:04:20:393 2019


DpHdlSoftCancel: cancel request for T17_U5189_M0 received from DISP (reason=DP_SOFTCANCEL_AD_MSG)
DpHdlSoftCancel: cancel request for T24_U5207_M0 received from GATEWAY (reason=DP_SOFTCANCEL_ABORT_PROGRAM)
DpHdlSoftCancel: cancel request for T2_U5204_M0 received from GATEWAY (reason=DP_SOFTCANCEL_ABORT_PROGRAM)

Sat Sep 14 11:08:50:261 2019


DpSendLoadInfo: quota for load / queue fill level = 7.200000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 7.250171 / 0.050000

Sat Sep 14 11:08:51:252 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 11:09:20:829 2019


DpSendLoadInfo: quota for load / queue fill level = 7.200000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 7.239106 / 0.028571

Sat Sep 14 11:09:21:114 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 11:10:03:203 2019


DpSendLoadInfo: quota for load / queue fill level = 7.200000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 7.254876 / 0.250000

Sat Sep 14 11:10:21:385 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 14 11:10:25:115 2019


DpHdlDeadWp: W3 (pid=32122) terminated automatically
DpWpDynCreate: created new work process W3-19909

Sat Sep 14 11:13:27:715 2019


DpWpDynCreate: created new work process W4-28510

Sat Sep 14 11:16:57:121 2019


DpWpDynCreate: created new work process W17-543

Sat Sep 14 11:18:36:078 2019


DpWpCheck: dyn W4, pid 28510 no longer needed, terminate now

Sat Sep 14 11:18:36:430 2019


DpHdlDeadWp: W4 (pid=28510) terminated automatically

Sat Sep 14 11:22:06:722 2019


DpHdlDeadWp: W17 (pid=543) terminated automatically

Sat Sep 14 11:29:45:713 2019


DpWpDynCreate: created new work process W7-8071

Sat Sep 14 11:34:56:109 2019


DpWpCheck: dyn W7, pid 8071 no longer needed, terminate now

Sat Sep 14 11:34:56:256 2019


DpHdlDeadWp: W7 (pid=8071) terminated automatically

Sat Sep 14 11:40:22:679 2019


DpWpDynCreate: created new work process W19-11961

Sat Sep 14 11:44:02:752 2019


DpHdlDeadWp: W10 (pid=30822) terminated automatically
DpWpDynCreate: created new work process W10-13291

Sat Sep 14 11:45:28:889 2019


DpHdlDeadWp: W19 (pid=11961) terminated automatically

Sat Sep 14 11:55:27:384 2019


DpWpDynCreate: created new work process W20-18192

Sat Sep 14 11:58:24:435 2019


DpWpDynCreate: created new work process W4-19649

Sat Sep 14 11:58:24:711 2019


DpWpDynCreate: created new work process W17-19650

Sat Sep 14 12:00:36:178 2019


DpWpCheck: dyn W20, pid 18192 no longer needed, terminate now

Sat Sep 14 12:00:37:736 2019


DpHdlDeadWp: W20 (pid=18192) terminated automatically

Sat Sep 14 12:01:00:501 2019


DpHdlDeadWp: W12 (pid=32601) terminated automatically
DpWpDynCreate: created new work process W12-21159

Sat Sep 14 12:01:44:846 2019


DpHdlDeadWp: W10 (pid=13291) terminated automatically
DpWpDynCreate: created new work process W10-21958

Sat Sep 14 12:01:52:491 2019


DpWpDynCreate: created new work process W7-22199

Sat Sep 14 12:03:26:575 2019


DpHdlDeadWp: W17 (pid=19650) terminated automatically

Sat Sep 14 12:03:28:995 2019


DpHdlDeadWp: W4 (pid=19649) terminated automatically

Sat Sep 14 12:05:39:053 2019


DpWpDynCreate: created new work process W19-29225

Sat Sep 14 12:06:56:215 2019


DpWpCheck: dyn W7, pid 22199 no longer needed, terminate now

Sat Sep 14 12:06:57:233 2019


DpHdlDeadWp: W7 (pid=22199) terminated automatically

Sat Sep 14 12:08:20:988 2019


DpWpDynCreate: created new work process W20-5917

Sat Sep 14 12:09:02:033 2019


DpWpDynCreate: created new work process W17-6338

Sat Sep 14 12:10:56:522 2019


DpHdlDeadWp: W19 (pid=29225) terminated automatically

Sat Sep 14 12:13:36:249 2019


DpWpCheck: dyn W20, pid 5917 no longer needed, terminate now

Sat Sep 14 12:13:36:736 2019

DpHdlDeadWp: W20 (pid=5917) terminated automatically

Sat Sep 14 12:14:16:250 2019


DpWpCheck: dyn W17, pid 6338 no longer needed, terminate now

Sat Sep 14 12:14:16:766 2019


DpHdlDeadWp: W17 (pid=6338) terminated automatically

Sat Sep 14 12:15:58:039 2019


DpWpDynCreate: created new work process W4-22210

Sat Sep 14 12:18:01:881 2019


DpWpDynCreate: created new work process W7-23331

Sat Sep 14 12:18:15:202 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Length of high priority queue exceeds limit

Sat Sep 14 12:21:16:269 2019


DpWpCheck: dyn W4, pid 22210 no longer needed, terminate now

Sat Sep 14 12:21:17:203 2019


DpHdlDeadWp: W4 (pid=22210) terminated automatically

Sat Sep 14 12:23:16:273 2019


DpWpCheck: dyn W7, pid 23331 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=23331) terminated automatically

Sat Sep 14 12:24:58:285 2019


DpWpDynCreate: created new work process W19-26668

Sat Sep 14 12:28:14:308 2019


DpWpDynCreate: created new work process W20-27633

Sat Sep 14 12:30:00:875 2019


DpHdlDeadWp: W19 (pid=26668) terminated automatically

Sat Sep 14 12:30:03:845 2019


DpWpDynCreate: created new work process W17-28780

Sat Sep 14 12:33:16:289 2019


DpWpCheck: dyn W20, pid 27633 no longer needed, terminate now

Sat Sep 14 12:33:18:404 2019


DpHdlDeadWp: W20 (pid=27633) terminated automatically

Sat Sep 14 12:33:56:507 2019


DpWpDynCreate: created new work process W4-30316

Sat Sep 14 12:35:15:503 2019


DpHdlDeadWp: W17 (pid=28780) terminated automatically

Sat Sep 14 12:39:16:297 2019


DpWpCheck: dyn W4, pid 30316 no longer needed, terminate now

Sat Sep 14 12:39:16:773 2019


DpHdlDeadWp: W4 (pid=30316) terminated automatically

Sat Sep 14 12:40:03:496 2019

DpWpDynCreate: created new work process W7-803

Sat Sep 14 12:45:16:055 2019


DpHdlDeadWp: W7 (pid=803) terminated automatically

Sat Sep 14 12:47:58:784 2019


DpWpDynCreate: created new work process W19-3746

Sat Sep 14 12:48:23:110 2019


DpWpDynCreate: created new work process W20-3878

Sat Sep 14 12:52:59:538 2019


DpHdlDeadWp: W19 (pid=3746) terminated automatically

Sat Sep 14 12:53:04:861 2019


DpWpDynCreate: created new work process W17-5771

Sat Sep 14 12:53:36:322 2019


DpWpCheck: dyn W20, pid 3878 no longer needed, terminate now

Sat Sep 14 12:53:36:642 2019


DpHdlDeadWp: W20 (pid=3878) terminated automatically

Sat Sep 14 12:58:09:317 2019


DpHdlDeadWp: W17 (pid=5771) terminated automatically

Sat Sep 14 13:00:32:111 2019


DpHdlDeadWp: W10 (pid=21958) terminated automatically
DpWpDynCreate: created new work process W10-8818

Sat Sep 14 13:08:15:672 2019


DpWpDynCreate: created new work process W4-6486

Sat Sep 14 13:08:58:637 2019


DpWpDynCreate: created new work process W7-6964

Sat Sep 14 13:13:18:500 2019


DpHdlDeadWp: W4 (pid=6486) terminated automatically

Sat Sep 14 13:13:58:914 2019


DpWpDynCreate: created new work process W19-8763

Sat Sep 14 13:13:59:723 2019


DpHdlDeadWp: W7 (pid=6964) terminated automatically

Sat Sep 14 13:19:04:031 2019


DpHdlDeadWp: W19 (pid=8763) terminated automatically

Sat Sep 14 13:23:58:577 2019


DpWpDynCreate: created new work process W20-12253

Sat Sep 14 13:28:13:915 2019


DpWpDynCreate: created new work process W17-13835

Sat Sep 14 13:29:16:386 2019


DpWpCheck: dyn W20, pid 12253 no longer needed, terminate now

Sat Sep 14 13:29:16:584 2019


DpHdlDeadWp: W20 (pid=12253) terminated automatically

Sat Sep 14 13:30:57:272 2019


DpWpDynCreate: created new work process W4-14565

Sat Sep 14 13:33:15:050 2019


DpHdlDeadWp: W17 (pid=13835) terminated automatically

Sat Sep 14 13:36:16:397 2019


DpWpCheck: dyn W4, pid 14565 no longer needed, terminate now

Sat Sep 14 13:36:17:443 2019


DpHdlDeadWp: W4 (pid=14565) terminated automatically

Sat Sep 14 13:38:26:571 2019


DpWpDynCreate: created new work process W7-17109

Sat Sep 14 13:38:59:841 2019


DpWpDynCreate: created new work process W19-17200

Sat Sep 14 13:40:56:404 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T37_U10717 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T37_U10717_M3 |001|SOLMAN_ADMIN|SST-LAP-LEN0028 |13:19:47|2 |RSICFTREE |high| | |SICF |
DpHdlSoftCancel: cancel request for T37_U10717_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlSoftCancel: cancel request for T37_U10717_M1 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlSoftCancel: cancel request for T37_U10717_M3 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlSoftCancel: cancel request for T37_U10717_M4 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlSoftCancel: cancel request for T37_U10717_M5 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Sat Sep 14 13:43:36:408 2019


DpWpCheck: dyn W7, pid 17109 no longer needed, terminate now

Sat Sep 14 13:43:36:851 2019


DpHdlDeadWp: W7 (pid=17109) terminated automatically

Sat Sep 14 13:44:00:304 2019


DpHdlDeadWp: W19 (pid=17200) terminated automatically

Sat Sep 14 13:44:58:152 2019


DpWpDynCreate: created new work process W20-19154

Sat Sep 14 13:48:23:309 2019


DpWpDynCreate: created new work process W17-20463

Sat Sep 14 13:49:59:334 2019


DpHdlDeadWp: W20 (pid=19154) terminated automatically

Sat Sep 14 13:52:57:854 2019


DpWpDynCreate: created new work process W4-23223

Sat Sep 14 13:52:59:172 2019


DpWpDynCreate: created new work process W7-23226

Sat Sep 14 13:53:10:348 2019


DpWpDynCreate: created new work process W19-23236

Sat Sep 14 13:53:36:429 2019


DpWpCheck: dyn W17, pid 20463 no longer needed, terminate now

Sat Sep 14 13:53:37:557 2019


DpHdlDeadWp: W17 (pid=20463) terminated automatically

Sat Sep 14 13:57:59:308 2019


DpHdlDeadWp: W11 (pid=32130) terminated automatically
DpWpDynCreate: created new work process W11-25215

Sat Sep 14 13:57:59:858 2019


DpHdlDeadWp: W4 (pid=23223) terminated automatically

Sat Sep 14 13:58:11:412 2019


DpWpCheck: dyn W7, pid 23226 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=23236) terminated automatically

Sat Sep 14 13:58:11:876 2019


DpHdlDeadWp: W7 (pid=23226) terminated automatically

Sat Sep 14 14:02:18:884 2019


DpWpDynCreate: created new work process W20-31078

Sat Sep 14 14:07:36:454 2019


DpWpCheck: dyn W20, pid 31078 no longer needed, terminate now

Sat Sep 14 14:07:37:397 2019


DpHdlDeadWp: W20 (pid=31078) terminated automatically

Sat Sep 14 14:08:16:109 2019


DpWpDynCreate: created new work process W17-19581

Sat Sep 14 14:08:58:704 2019


DpWpDynCreate: created new work process W4-22843

Sat Sep 14 14:13:36:464 2019


DpWpCheck: dyn W17, pid 19581 no longer needed, terminate now

Sat Sep 14 14:13:36:733 2019


DpHdlDeadWp: W17 (pid=19581) terminated automatically

Sat Sep 14 14:14:00:435 2019


DpHdlDeadWp: W4 (pid=22843) terminated automatically

Sat Sep 14 14:16:54:086 2019


DpWpDynCreate: created new work process W19-27333

Sat Sep 14 14:17:54:097 2019


DpWpDynCreate: created new work process W7-27844

Sat Sep 14 14:21:56:479 2019


DpWpCheck: dyn W19, pid 27333 no longer needed, terminate now

Sat Sep 14 14:21:57:106 2019


DpHdlDeadWp: W19 (pid=27333) terminated automatically

Sat Sep 14 14:22:56:674 2019


DpHdlDeadWp: W7 (pid=27844) terminated automatically

Sat Sep 14 14:23:58:982 2019


DpWpDynCreate: created new work process W20-30146

Sat Sep 14 14:28:04:968 2019


DpWpDynCreate: created new work process W17-31356

Sat Sep 14 14:29:01:685 2019


DpHdlDeadWp: W20 (pid=30146) terminated automatically

Sat Sep 14 14:33:16:695 2019


DpWpCheck: dyn W17, pid 31356 no longer needed, terminate now

Sat Sep 14 14:33:17:070 2019


DpHdlDeadWp: W17 (pid=31356) terminated automatically

Sat Sep 14 14:42:13:939 2019


DpWpDynCreate: created new work process W4-3892

Sat Sep 14 14:47:16:719 2019


DpWpCheck: dyn W4, pid 3892 no longer needed, terminate now

Sat Sep 14 14:47:17:785 2019


DpHdlDeadWp: W4 (pid=3892) terminated automatically

Sat Sep 14 14:48:22:092 2019


DpWpDynCreate: created new work process W19-6024

Sat Sep 14 14:51:56:725 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T0_U2797 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T0_U2797_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |14:10:23|0 |SAPLBTCH |high| | |SM37 |
DpHdlSoftCancel: cancel request for T0_U2797_M1 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Sat Sep 14 14:53:07:065 2019


DpWpDynCreate: created new work process W7-7867

Sat Sep 14 14:53:36:728 2019


DpWpCheck: dyn W19, pid 6024 no longer needed, terminate now

Sat Sep 14 14:53:37:170 2019


DpHdlDeadWp: W19 (pid=6024) terminated automatically

Sat Sep 14 14:53:58:811 2019


DpWpDynCreate: created new work process W20-8058

Sat Sep 14 14:58:08:466 2019


DpHdlDeadWp: W7 (pid=7867) terminated automatically

Sat Sep 14 14:59:16:737 2019


DpWpCheck: dyn W20, pid 8058 no longer needed, terminate now

Sat Sep 14 14:59:17:607 2019


DpHdlDeadWp: W20 (pid=8058) terminated automatically

Sat Sep 14 15:05:59:169 2019


DpWpDynCreate: created new work process W17-27844

Sat Sep 14 15:08:14:859 2019


DpWpDynCreate: created new work process W4-5818

Sat Sep 14 15:11:00:440 2019


DpHdlDeadWp: W17 (pid=27844) terminated automatically

Sat Sep 14 15:13:16:160 2019


DpHdlDeadWp: W4 (pid=5818) terminated automatically

Sat Sep 14 15:23:58:725 2019


DpWpDynCreate: created new work process W19-13251

Sat Sep 14 15:28:13:413 2019


DpWpDynCreate: created new work process W7-14371

Sat Sep 14 15:28:13:516 2019


DpWpDynCreate: created new work process W20-14372

Sat Sep 14 15:28:59:776 2019


DpHdlDeadWp: W19 (pid=13251) terminated automatically

Sat Sep 14 15:33:14:839 2019


DpHdlDeadWp: W7 (pid=14371) terminated automatically

Sat Sep 14 15:33:15:248 2019


DpHdlDeadWp: W20 (pid=14372) terminated automatically

Sat Sep 14 15:33:58:714 2019


DpWpDynCreate: created new work process W17-16393

Sat Sep 14 15:38:22:262 2019


DpWpDynCreate: created new work process W4-17609

Sat Sep 14 15:38:59:891 2019


DpHdlDeadWp: W17 (pid=16393) terminated automatically

Sat Sep 14 15:43:07:598 2019


DpWpDynCreate: created new work process W19-19383

Sat Sep 14 15:43:26:418 2019


DpHdlDeadWp: W4 (pid=17609) terminated automatically

Sat Sep 14 15:48:08:767 2019


DpHdlDeadWp: W19 (pid=19383) terminated automatically

Sat Sep 14 15:48:19:384 2019


DpWpDynCreate: created new work process W7-21114

Sat Sep 14 15:53:29:113 2019


DpHdlDeadWp: W7 (pid=21114) terminated automatically

Sat Sep 14 15:58:58:816 2019


DpWpDynCreate: created new work process W20-24993

Sat Sep 14 16:04:00:624 2019


DpHdlDeadWp: W20 (pid=24993) terminated automatically

Sat Sep 14 16:08:15:481 2019


DpWpDynCreate: created new work process W17-22753

Sat Sep 14 16:13:16:366 2019


DpHdlDeadWp: W17 (pid=22753) terminated automatically

Sat Sep 14 16:13:17:581 2019


DpWpDynCreate: created new work process W4-25200

Sat Sep 14 16:18:36:877 2019


DpWpCheck: dyn W4, pid 25200 no longer needed, terminate now

Sat Sep 14 16:18:37:338 2019


DpHdlDeadWp: W4 (pid=25200) terminated automatically

Sat Sep 14 16:23:07:605 2019


DpWpDynCreate: created new work process W19-28683

Sat Sep 14 16:23:09:101 2019


DpWpDynCreate: created new work process W7-28704

Sat Sep 14 16:28:08:358 2019


DpHdlDeadWp: W19 (pid=28683) terminated automatically

Sat Sep 14 16:28:11:018 2019


DpHdlDeadWp: W7 (pid=28704) terminated automatically

Sat Sep 14 16:28:13:454 2019


DpWpDynCreate: created new work process W20-30378

Sat Sep 14 16:28:58:743 2019


DpWpDynCreate: created new work process W17-30637

Sat Sep 14 16:33:14:464 2019


DpHdlDeadWp: W20 (pid=30378) terminated automatically

Sat Sep 14 16:33:58:791 2019


DpWpDynCreate: created new work process W4-32115

Sat Sep 14 16:34:00:717 2019


DpHdlDeadWp: W17 (pid=30637) terminated automatically

Sat Sep 14 16:37:59:340 2019


DpWpDynCreate: created new work process W19-1317

Sat Sep 14 16:38:59:689 2019


DpHdlDeadWp: W4 (pid=32115) terminated automatically

Sat Sep 14 16:43:00:864 2019


DpHdlDeadWp: W19 (pid=1317) terminated automatically

Sat Sep 14 16:48:18:343 2019


DpWpDynCreate: created new work process W7-4824

Sat Sep 14 16:48:58:830 2019


DpWpDynCreate: created new work process W20-5091

Sat Sep 14 16:53:36:941 2019


DpWpCheck: dyn W7, pid 4824 no longer needed, terminate now

Sat Sep 14 16:53:37:125 2019


DpHdlDeadWp: W7 (pid=4824) terminated automatically

Sat Sep 14 16:53:58:989 2019


DpWpDynCreate: created new work process W17-6694

Sat Sep 14 16:54:16:942 2019


DpWpCheck: dyn W20, pid 5091 no longer needed, terminate now

Sat Sep 14 16:54:17:130 2019


DpHdlDeadWp: W20 (pid=5091) terminated automatically

Sat Sep 14 16:58:59:853 2019


DpHdlDeadWp: W17 (pid=6694) terminated automatically

Sat Sep 14 17:05:03:352 2019


DpWpDynCreate: created new work process W4-24037

Sat Sep 14 17:08:14:873 2019


DpWpDynCreate: created new work process W19-6583

Sat Sep 14 17:08:56:778 2019


DpWpDynCreate: created new work process W7-7205

Sat Sep 14 17:10:16:511 2019


DpHdlDeadWp: W4 (pid=24037) terminated automatically

Sat Sep 14 17:13:16:346 2019


DpHdlDeadWp: W19 (pid=6583) terminated automatically

Sat Sep 14 17:14:08:871 2019


DpHdlDeadWp: W7 (pid=7205) terminated automatically

Sat Sep 14 17:18:17:089 2019


DpWpDynCreate: created new work process W20-10431

Sat Sep 14 17:21:07:409 2019


DpHdlDeadWp: W11 (pid=25215) terminated automatically
DpWpDynCreate: created new work process W11-11454

Sat Sep 14 17:22:41:461 2019


DpHdlDeadWp: W9 (pid=32128) terminated automatically
DpWpDynCreate: created new work process W9-11881

Sat Sep 14 17:22:53:074 2019


DpWpDynCreate: created new work process W17-11937

Sat Sep 14 17:23:36:993 2019


DpWpCheck: dyn W20, pid 10431 no longer needed, terminate now

Sat Sep 14 17:23:37:443 2019


DpHdlDeadWp: W20 (pid=10431) terminated automatically

Sat Sep 14 17:23:57:672 2019


DpWpDynCreate: created new work process W4-12357

Sat Sep 14 17:27:57:141 2019


DpHdlDeadWp: W17 (pid=11937) terminated automatically

Sat Sep 14 17:28:14:421 2019


DpWpDynCreate: created new work process W19-13691
DpWpDynCreate: created new work process W7-13692

Sat Sep 14 17:28:59:382 2019


DpHdlDeadWp: W4 (pid=12357) terminated automatically

Sat Sep 14 17:33:06:951 2019


DpWpDynCreate: created new work process W20-15277

Sat Sep 14 17:33:17:152 2019


DpWpCheck: dyn W7, pid 13692 no longer needed, terminate now
DpWpCheck: dyn W19, pid 13691 no longer needed, terminate now

Sat Sep 14 17:33:17:539 2019


DpHdlDeadWp: W7 (pid=13692) terminated automatically
DpHdlDeadWp: W19 (pid=13691) terminated automatically

Sat Sep 14 17:33:59:983 2019


DpWpDynCreate: created new work process W17-15468

Sat Sep 14 17:38:14:904 2019


DpHdlDeadWp: W20 (pid=15277) terminated automatically

Sat Sep 14 17:38:15:045 2019


DpWpDynCreate: created new work process W4-16845

Sat Sep 14 17:39:00:780 2019


DpHdlDeadWp: W17 (pid=15468) terminated automatically

Sat Sep 14 17:43:17:168 2019


DpWpCheck: dyn W4, pid 16845 no longer needed, terminate now

Sat Sep 14 17:43:17:637 2019


DpHdlDeadWp: W4 (pid=16845) terminated automatically

Sat Sep 14 17:44:00:778 2019


DpWpDynCreate: created new work process W7-18785

Sat Sep 14 17:44:01:208 2019


DpWpDynCreate: created new work process W19-18786

Sat Sep 14 17:49:17:179 2019


DpWpCheck: dyn W7, pid 18785 no longer needed, terminate now
DpWpCheck: dyn W19, pid 18786 no longer needed, terminate now

Sat Sep 14 17:49:17:463 2019


DpHdlDeadWp: W7 (pid=18785) terminated automatically

Sat Sep 14 17:49:18:860 2019


DpHdlDeadWp: W19 (pid=18786) terminated automatically

Sat Sep 14 17:50:09:959 2019


DpWpDynCreate: created new work process W20-20879

Sat Sep 14 17:55:17:761 2019


DpHdlDeadWp: W20 (pid=20879) terminated automatically

Sat Sep 14 17:56:02:710 2019


DpWpDynCreate: created new work process W17-22685

Sat Sep 14 18:01:17:214 2019


DpWpCheck: dyn W17, pid 22685 no longer needed, terminate now

Sat Sep 14 18:01:17:527 2019


DpHdlDeadWp: W17 (pid=22685) terminated automatically

Sat Sep 14 18:08:15:347 2019


DpWpDynCreate: created new work process W4-22307

Sat Sep 14 18:08:15:539 2019


DpWpDynCreate: created new work process W7-22308

Sat Sep 14 18:08:57:605 2019


DpWpDynCreate: created new work process W19-22562

Sat Sep 14 18:08:58:422 2019


DpWpDynCreate: created new work process W20-22566

Sat Sep 14 18:13:17:244 2019


DpWpCheck: dyn W4, pid 22307 no longer needed, terminate now
DpWpCheck: dyn W7, pid 22308 no longer needed, terminate now

Sat Sep 14 18:13:17:928 2019


DpHdlDeadWp: W7 (pid=22308) terminated automatically

Sat Sep 14 18:13:19:400 2019


DpHdlDeadWp: W4 (pid=22307) terminated automatically

Sat Sep 14 18:13:58:687 2019


DpHdlDeadWp: W19 (pid=22562) terminated automatically

Sat Sep 14 18:13:59:521 2019


DpHdlDeadWp: W20 (pid=22566) terminated automatically

Sat Sep 14 18:14:58:130 2019


DpWpDynCreate: created new work process W17-24974

Sat Sep 14 18:17:58:388 2019


DpWpDynCreate: created new work process W7-25828

Sat Sep 14 18:18:16:663 2019


DpWpDynCreate: created new work process W4-26024

Sat Sep 14 18:20:03:489 2019


DpHdlDeadWp: W17 (pid=24974) terminated automatically

Sat Sep 14 18:22:59:944 2019


DpHdlDeadWp: W7 (pid=25828) terminated automatically
DpWpDynCreate: created new work process W7-27637

Sat Sep 14 18:23:08:896 2019


DpWpDynCreate: created new work process W19-27642

Sat Sep 14 18:23:17:265 2019


DpWpCheck: dyn W4, pid 26024 no longer needed, terminate now

Sat Sep 14 18:23:19:607 2019


DpHdlDeadWp: W4 (pid=26024) terminated automatically

Sat Sep 14 18:28:00:694 2019


DpHdlDeadWp: W7 (pid=27637) terminated automatically

Sat Sep 14 18:28:04:153 2019


DpWpDynCreate: created new work process W20-29199

Sat Sep 14 18:28:09:519 2019


DpHdlDeadWp: W19 (pid=27642) terminated automatically

Sat Sep 14 18:28:14:535 2019


DpWpDynCreate: created new work process W17-29218

Sat Sep 14 18:30:58:038 2019


DpWpDynCreate: created new work process W4-30252

Sat Sep 14 18:33:07:392 2019


DpWpDynCreate: created new work process W7-30839

Sat Sep 14 18:33:15:501 2019


DpHdlDeadWp: W17 (pid=29218) terminated automatically
DpWpCheck: dyn W20, pid 29199 no longer needed, terminate now

Sat Sep 14 18:33:16:148 2019


DpHdlDeadWp: W20 (pid=29199) terminated automatically

Sat Sep 14 18:35:59:314 2019


DpHdlDeadWp: W4 (pid=30252) terminated automatically

Sat Sep 14 18:38:08:458 2019


DpHdlDeadWp: W7 (pid=30839) terminated automatically

Sat Sep 14 18:38:15:000 2019


DpWpDynCreate: created new work process W19-369

Sat Sep 14 18:43:04:409 2019


DpWpDynCreate: created new work process W17-1903

Sat Sep 14 18:43:16:599 2019


DpHdlDeadWp: W19 (pid=369) terminated automatically
DpWpDynCreate: created new work process W20-1913

Sat Sep 14 18:48:07:152 2019


DpHdlDeadWp: W17 (pid=1903) terminated automatically

Sat Sep 14 18:48:17:321 2019


DpWpCheck: dyn W20, pid 1913 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=1913) terminated automatically

Sat Sep 14 18:48:17:566 2019


DpWpDynCreate: created new work process W4-3859

Sat Sep 14 18:48:17:947 2019


DpWpDynCreate: created new work process W7-3860

Sat Sep 14 18:53:37:330 2019


DpWpCheck: dyn W4, pid 3859 no longer needed, terminate now
DpWpCheck: dyn W7, pid 3860 no longer needed, terminate now

Sat Sep 14 18:53:37:482 2019


DpHdlDeadWp: W4 (pid=3859) terminated automatically
DpHdlDeadWp: W7 (pid=3860) terminated automatically

Sat Sep 14 19:00:02:959 2019


DpWpDynCreate: created new work process W19-7882

Sat Sep 14 19:05:16:090 2019


DpHdlDeadWp: W19 (pid=7882) terminated automatically

Sat Sep 14 19:06:58:142 2019


DpWpDynCreate: created new work process W17-21109

Sat Sep 14 19:06:58:673 2019


DpWpDynCreate: created new work process W20-21145

Sat Sep 14 19:11:59:881 2019


DpHdlDeadWp: W17 (pid=21109) terminated automatically
DpWpCheck: dyn W20, pid 21145 no longer needed, terminate now

Sat Sep 14 19:12:00:652 2019


DpHdlDeadWp: W20 (pid=21145) terminated automatically

Sat Sep 14 19:16:48:230 2019


DpWpDynCreate: created new work process W4-8891

Sat Sep 14 19:21:57:864 2019


DpHdlDeadWp: W4 (pid=8891) terminated automatically

Sat Sep 14 19:23:08:152 2019


DpWpDynCreate: created new work process W7-11050

Sat Sep 14 19:28:14:529 2019


DpWpDynCreate: created new work process W19-12744

Sat Sep 14 19:28:14:651 2019


DpWpDynCreate: created new work process W17-12745

Sat Sep 14 19:28:17:876 2019


DpWpCheck: dyn W7, pid 11050 no longer needed, terminate now

Sat Sep 14 19:28:18:231 2019


DpHdlDeadWp: W7 (pid=11050) terminated automatically

Sat Sep 14 19:33:15:576 2019


DpHdlDeadWp: W17 (pid=12745) terminated automatically
DpHdlDeadWp: W19 (pid=12744) terminated automatically

Sat Sep 14 19:37:58:318 2019


DpWpDynCreate: created new work process W20-15992

Sat Sep 14 19:43:00:311 2019


DpHdlDeadWp: W20 (pid=15992) terminated automatically

Sat Sep 14 19:43:23:073 2019


DpWpDynCreate: created new work process W4-17605

Sat Sep 14 19:48:17:815 2019


DpWpDynCreate: created new work process W7-19157

Sat Sep 14 19:48:28:611 2019


DpHdlDeadWp: W4 (pid=17605) terminated automatically

Sat Sep 14 19:51:01:007 2019


DpWpDynCreate: created new work process W17-20075

Sat Sep 14 19:51:01:226 2019


DpWpDynCreate: created new work process W19-20076

Sat Sep 14 19:53:37:917 2019


DpWpCheck: dyn W7, pid 19157 no longer needed, terminate now

Sat Sep 14 19:53:38:901 2019


DpHdlDeadWp: W7 (pid=19157) terminated automatically

Sat Sep 14 19:56:03:489 2019


DpWpCheck: dyn W17, pid 20075 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=20076) terminated automatically

Sat Sep 14 19:56:04:120 2019


DpHdlDeadWp: W17 (pid=20075) terminated automatically

Sat Sep 14 19:58:00:021 2019


DpHdlDeadWp: W11 (pid=11454) terminated automatically
DpWpDynCreate: created new work process W11-22396

Sat Sep 14 20:01:47:306 2019


DpWpDynCreate: created new work process W20-23547

Sat Sep 14 20:01:49:132 2019


DpWpDynCreate: created new work process W4-23556

Sat Sep 14 20:02:45:431 2019


DpWpDynCreate: created new work process W7-24685

Sat Sep 14 20:05:56:479 2019


DpWpDynCreate: created new work process W19-30948

Sat Sep 14 20:06:50:080 2019


DpWpCheck: dyn W4, pid 23556 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=23547) terminated automatically

Sat Sep 14 20:06:50:746 2019


DpHdlDeadWp: W4 (pid=23556) terminated automatically

Sat Sep 14 20:07:52:718 2019


DpHdlDeadWp: W7 (pid=24685) terminated automatically

Sat Sep 14 20:08:36:520 2019


DpWpDynCreate: created new work process W17-5693

Sat Sep 14 20:10:57:950 2019


DpWpCheck: dyn W19, pid 30948 no longer needed, terminate now

Sat Sep 14 20:10:58:593 2019


DpHdlDeadWp: W19 (pid=30948) terminated automatically

Sat Sep 14 20:13:37:953 2019


DpWpCheck: dyn W17, pid 5693 no longer needed, terminate now

Sat Sep 14 20:13:38:678 2019


DpHdlDeadWp: W17 (pid=5693) terminated automatically

Sat Sep 14 20:24:00:702 2019


DpWpDynCreate: created new work process W20-27081

Sat Sep 14 20:29:01:817 2019


DpHdlDeadWp: W20 (pid=27081) terminated automatically

Sat Sep 14 20:29:58:226 2019


DpWpDynCreate: created new work process W4-28873

Sat Sep 14 20:31:02:020 2019


DpWpDynCreate: created new work process W7-29181

Sat Sep 14 20:35:00:701 2019


DpHdlDeadWp: W4 (pid=28873) terminated automatically

Sat Sep 14 20:35:11:450 2019


DpWpDynCreate: created new work process W19-30759

Sat Sep 14 20:36:04:138 2019


DpHdlDeadWp: W7 (pid=29181) terminated automatically

Sat Sep 14 20:39:58:249 2019


DpWpDynCreate: created new work process W17-32403

Sat Sep 14 20:40:09:117 2019


DpWpDynCreate: created new work process W20-32417

Sat Sep 14 20:40:16:398 2019


DpHdlDeadWp: W19 (pid=30759) terminated automatically

Sat Sep 14 20:44:59:351 2019


DpHdlDeadWp: W17 (pid=32403) terminated automatically

Sat Sep 14 20:45:06:358 2019


DpWpDynCreate: created new work process W4-1704

Sat Sep 14 20:45:10:133 2019


DpHdlDeadWp: W20 (pid=32417) terminated automatically

Sat Sep 14 20:48:59:212 2019


DpWpDynCreate: created new work process W7-3042

Sat Sep 14 20:50:07:597 2019


DpHdlDeadWp: W4 (pid=1704) terminated automatically

Sat Sep 14 20:52:58:126 2019


DpWpDynCreate: created new work process W19-4571

Sat Sep 14 20:54:18:026 2019


DpWpCheck: dyn W7, pid 3042 no longer needed, terminate now

Sat Sep 14 20:54:19:155 2019


DpHdlDeadWp: W7 (pid=3042) terminated automatically

Sat Sep 14 20:55:07:687 2019


DpWpDynCreate: created new work process W17-5256

Sat Sep 14 20:56:10:410 2019


DpHdlDeadWp: W12 (pid=21159) terminated automatically
DpWpDynCreate: created new work process W12-5545

Sat Sep 14 20:57:59:538 2019


DpHdlDeadWp: W19 (pid=4571) terminated automatically

Sat Sep 14 21:00:11:039 2019


DpHdlDeadWp: W17 (pid=5256) terminated automatically

Sat Sep 14 21:00:21:609 2019


DpWpDynCreate: created new work process W20-7388

Sat Sep 14 21:05:38:043 2019


DpWpCheck: dyn W20, pid 7388 no longer needed, terminate now

Sat Sep 14 21:05:38:983 2019


DpHdlDeadWp: W20 (pid=7388) terminated automatically

Sat Sep 14 21:09:57:975 2019


DpWpDynCreate: created new work process W4-5852

Sat Sep 14 21:14:59:700 2019


DpHdlDeadWp: W4 (pid=5852) terminated automatically

Sat Sep 14 21:22:57:722 2019


DpWpDynCreate: created new work process W7-10485

Sat Sep 14 21:27:58:084 2019


DpWpCheck: dyn W7, pid 10485 no longer needed, terminate now

Sat Sep 14 21:27:58:986 2019


DpHdlDeadWp: W7 (pid=10485) terminated automatically

Sat Sep 14 21:28:58:831 2019


DpWpDynCreate: created new work process W19-12533

Sat Sep 14 21:30:14:956 2019


DpWpDynCreate: created new work process W17-12839

Sat Sep 14 21:34:03:446 2019


DpHdlDeadWp: W19 (pid=12533) terminated automatically

Sat Sep 14 21:35:16:575 2019


DpHdlDeadWp: W17 (pid=12839) terminated automatically

Sat Sep 14 21:36:00:987 2019


DpWpDynCreate: created new work process W20-14506

Sat Sep 14 21:40:19:665 2019


DpWpDynCreate: created new work process W4-16091

Sat Sep 14 21:41:02:709 2019


DpHdlDeadWp: W20 (pid=14506) terminated automatically

Sat Sep 14 21:45:20:210 2019


DpHdlDeadWp: W4 (pid=16091) terminated automatically

Sat Sep 14 21:51:07:989 2019


DpWpDynCreate: created new work process W7-19450

Sat Sep 14 21:56:09:756 2019


DpHdlDeadWp: W7 (pid=19450) terminated automatically

Sat Sep 14 22:00:16:363 2019


DpWpDynCreate: created new work process W19-22587

Sat Sep 14 22:05:18:049 2019


DpWpDynCreate: created new work process W17-6065
DpWpCheck: dyn W19, pid 22587 no longer needed, terminate now

Sat Sep 14 22:05:18:233 2019


DpHdlDeadWp: W19 (pid=22587) terminated automatically

Sat Sep 14 22:05:18:365 2019


DpWpDynCreate: created new work process W20-6069

Sat Sep 14 22:10:28:794 2019


DpWpCheck: dyn W17, pid 6065 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=6069) terminated automatically

Sat Sep 14 22:10:29:307 2019


DpHdlDeadWp: W17 (pid=6065) terminated automatically

Sat Sep 14 22:18:38:605 2019


DpWpDynCreate: created new work process W4-24563

Sat Sep 14 22:23:57:464 2019


DpHdlDeadWp: W4 (pid=24563) terminated automatically

Sat Sep 14 22:23:57:593 2019


DpWpDynCreate: created new work process W7-26283

Sat Sep 14 22:25:07:980 2019


DpWpDynCreate: created new work process W19-26742

Sat Sep 14 22:28:58:180 2019


DpWpCheck: dyn W7, pid 26283 no longer needed, terminate now

Sat Sep 14 22:28:58:663 2019


DpHdlDeadWp: W7 (pid=26283) terminated automatically
DpWpDynCreate: created new work process W7-28006

Sat Sep 14 22:30:13:313 2019


DpHdlDeadWp: W19 (pid=26742) terminated automatically

Sat Sep 14 22:30:14:466 2019


DpWpDynCreate: created new work process W20-28424

Sat Sep 14 22:34:18:190 2019


DpWpCheck: dyn W7, pid 28006 no longer needed, terminate now

Sat Sep 14 22:34:18:998 2019


DpHdlDeadWp: W7 (pid=28006) terminated automatically

Sat Sep 14 22:35:16:121 2019


DpHdlDeadWp: W20 (pid=28424) terminated automatically

Sat Sep 14 22:36:01:092 2019


DpWpDynCreate: created new work process W17-30286

Sat Sep 14 22:40:08:914 2019


DpWpDynCreate: created new work process W4-31601

Sat Sep 14 22:41:02:841 2019


DpHdlDeadWp: W17 (pid=30286) terminated automatically

Sat Sep 14 22:44:01:682 2019


DpWpDynCreate: created new work process W19-753

Sat Sep 14 22:45:09:841 2019


DpHdlDeadWp: W4 (pid=31601) terminated automatically

Sat Sep 14 22:47:59:420 2019


DpWpDynCreate: created new work process W7-2068
DpWpDynCreate: created new work process W20-2069

Sat Sep 14 22:49:18:219 2019


DpWpCheck: dyn W19, pid 753 no longer needed, terminate now

Sat Sep 14 22:49:19:305 2019


DpHdlDeadWp: W19 (pid=753) terminated automatically

Sat Sep 14 22:53:18:227 2019


DpWpCheck: dyn W7, pid 2068 no longer needed, terminate now
DpWpCheck: dyn W20, pid 2069 no longer needed, terminate now

Sat Sep 14 22:53:18:567 2019


DpHdlDeadWp: W7 (pid=2068) terminated automatically

Sat Sep 14 22:53:20:664 2019


DpHdlDeadWp: W20 (pid=2069) terminated automatically

Sat Sep 14 22:54:57:979 2019


DpWpDynCreate: created new work process W17-4657

Sat Sep 14 22:59:58:243 2019


DpHdlDeadWp: W17 (pid=4657) terminated automatically

Sat Sep 14 23:00:07:679 2019


DpWpDynCreate: created new work process W4-6200

Sat Sep 14 23:05:18:131 2019


DpHdlDeadWp: W4 (pid=6200) terminated automatically

Sat Sep 14 23:06:01:275 2019


DpWpDynCreate: created new work process W19-24318

Sat Sep 14 23:11:03:435 2019


DpHdlDeadWp: W19 (pid=24318) terminated automatically

Sat Sep 14 23:16:49:014 2019


DpWpDynCreate: created new work process W7-7318

Sat Sep 14 23:21:56:927 2019


DpHdlDeadWp: W7 (pid=7318) terminated automatically

Sat Sep 14 23:30:15:609 2019


DpWpDynCreate: created new work process W20-11696

Sat Sep 14 23:31:58:590 2019


DpWpDynCreate: created new work process W17-12413

Sat Sep 14 23:35:08:148 2019


DpWpDynCreate: created new work process W4-13267

Sat Sep 14 23:35:17:897 2019


DpHdlDeadWp: W20 (pid=11696) terminated automatically

Sat Sep 14 23:37:00:086 2019


DpHdlDeadWp: W17 (pid=12413) terminated automatically

Sat Sep 14 23:40:10:402 2019


DpHdlDeadWp: W4 (pid=13267) terminated automatically

Sat Sep 14 23:40:16:866 2019


DpWpDynCreate: created new work process W19-14864

Sat Sep 14 23:40:24:484 2019


DpWpDynCreate: created new work process W7-14867

Sat Sep 14 23:45:17:453 2019


DpHdlDeadWp: W19 (pid=14864) terminated automatically

Sat Sep 14 23:45:26:680 2019


DpHdlDeadWp: W7 (pid=14867) terminated automatically

Sat Sep 14 23:50:28:005 2019


DpWpDynCreate: created new work process W20-18017

Sat Sep 14 23:52:05:095 2019


DpWpDynCreate: created new work process W17-18534

Sat Sep 14 23:55:38:348 2019


DpWpCheck: dyn W20, pid 18017 no longer needed, terminate now

Sat Sep 14 23:55:39:249 2019


DpHdlDeadWp: W20 (pid=18017) terminated automatically

Sat Sep 14 23:56:04:741 2019


DpWpDynCreate: created new work process W4-19804

Sat Sep 14 23:57:07:235 2019


DpHdlDeadWp: W17 (pid=18534) terminated automatically

Sun Sep 15 00:00:03:013 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 9->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5
DpWpConf: wp reconfiguration, stop W18, pid 572
DpAdaptWppriv_max_no : 6 -> 4

Sun Sep 15 00:00:04:383 2019


DpHdlDeadWp: W18 (pid=572) terminated automatically

Sun Sep 15 00:00:09:761 2019


DpWpDynCreate: created new work process W19-21308

Sun Sep 15 00:05:10:641 2019


DpHdlDeadWp: W19 (pid=21308) terminated automatically

Sun Sep 15 00:05:13:427 2019


DpWpDynCreate: created new work process W7-28701

Sun Sep 15 00:05:13:732 2019


DpWpDynCreate: created new work process W20-28722

Sun Sep 15 00:10:15:041 2019


DpHdlDeadWp: W7 (pid=28701) terminated automatically
DpHdlDeadWp: W20 (pid=28722) terminated automatically

Sun Sep 15 00:23:16:905 2019


DpWpDynCreate: created new work process W17-5834

Sun Sep 15 00:23:59:307 2019


DpWpDynCreate: created new work process W18-6644

Sun Sep 15 00:28:18:402 2019


DpWpCheck: dyn W17, pid 5834 no longer needed, terminate now

Sun Sep 15 00:28:18:932 2019


DpHdlDeadWp: W17 (pid=5834) terminated automatically

Sun Sep 15 00:29:01:042 2019


DpHdlDeadWp: W18 (pid=6644) terminated automatically

Sun Sep 15 00:55:08:124 2019


DpWpDynCreate: created new work process W19-29562

Sun Sep 15 01:00:18:464 2019


DpWpCheck: dyn W19, pid 29562 no longer needed, terminate now

Sun Sep 15 01:00:19:019 2019


DpHdlDeadWp: W19 (pid=29562) terminated automatically

Sun Sep 15 01:04:57:119 2019


DpHdlDeadWp: W10 (pid=8818) terminated automatically
DpWpDynCreate: created new work process W10-20287

Sun Sep 15 01:05:13:951 2019


DpWpDynCreate: created new work process W7-21230

Sun Sep 15 01:09:52:278 2019


DpWpDynCreate: created new work process W20-7104

Sun Sep 15 01:10:15:875 2019


DpHdlDeadWp: W7 (pid=21230) terminated automatically

Sun Sep 15 01:14:55:099 2019


DpHdlDeadWp: W20 (pid=7104) terminated automatically

Sun Sep 15 01:22:05:657 2019


DpWpDynCreate: created new work process W17-11086
DpWpDynCreate: created new work process W18-11087

Sun Sep 15 01:25:09:578 2019


DpWpDynCreate: created new work process W19-12044

Sun Sep 15 01:27:07:467 2019


DpHdlDeadWp: W17 (pid=11086) terminated automatically
DpWpCheck: dyn W18, pid 11087 no longer needed, terminate now

Sun Sep 15 01:27:08:540 2019


DpHdlDeadWp: W18 (pid=11087) terminated automatically

Sun Sep 15 01:30:16:878 2019


DpHdlDeadWp: W19 (pid=12044) terminated automatically
DpWpDynCreate: created new work process W7-13545

Sun Sep 15 01:35:00:298 2019


DpWpDynCreate: created new work process W20-14964

Sun Sep 15 01:35:17:575 2019


DpHdlDeadWp: W7 (pid=13545) terminated automatically

Sun Sep 15 01:40:01:989 2019


DpHdlDeadWp: W20 (pid=14964) terminated automatically

Sun Sep 15 01:40:15:569 2019


DpWpDynCreate: created new work process W17-16625

Sun Sep 15 01:40:20:739 2019


DpWpDynCreate: created new work process W18-16629

Sun Sep 15 01:41:59:210 2019


DpHdlDeadWp: W9 (pid=11881) terminated automatically
DpWpDynCreate: created new work process W9-17146

Sun Sep 15 01:44:58:931 2019


DpWpDynCreate: created new work process W19-18057

Sun Sep 15 01:45:18:557 2019


DpWpCheck: dyn W17, pid 16625 no longer needed, terminate now

Sun Sep 15 01:45:18:713 2019


DpHdlDeadWp: W17 (pid=16625) terminated automatically

Sun Sep 15 01:45:21:947 2019


DpHdlDeadWp: W18 (pid=16629) terminated automatically

Sun Sep 15 01:50:01:754 2019


DpHdlDeadWp: W19 (pid=18057) terminated automatically

Sun Sep 15 01:50:07:359 2019


DpWpDynCreate: created new work process W7-19856

Sun Sep 15 01:50:16:573 2019


DpWpDynCreate: created new work process W20-19865

Sun Sep 15 01:55:11:816 2019


DpHdlDeadWp: W7 (pid=19856) terminated automatically

Sun Sep 15 01:55:12:050 2019


DpWpDynCreate: created new work process W17-21561

Sun Sep 15 01:55:17:734 2019


DpHdlDeadWp: W20 (pid=19865) terminated automatically

Sun Sep 15 02:00:15:990 2019


DpHdlDeadWp: W17 (pid=21561) terminated automatically

Sun Sep 15 02:00:23:050 2019


DpWpDynCreate: created new work process W18-23527

Sun Sep 15 02:00:29:908 2019


DpHdlDeadWp: W9 (pid=17146) terminated automatically
DpWpDynCreate: created new work process W9-23644

Sun Sep 15 02:05:21:078 2019


DpWpDynCreate: created new work process W19-29526

Sun Sep 15 02:05:27:576 2019


DpHdlDeadWp: W18 (pid=23527) terminated automatically

Sun Sep 15 02:06:01:278 2019


DpWpDynCreate: created new work process W7-31172

Sun Sep 15 02:10:33:169 2019


DpHdlDeadWp: W19 (pid=29526) terminated automatically

Sun Sep 15 02:11:03:414 2019


DpHdlDeadWp: W7 (pid=31172) terminated automatically

Sun Sep 15 02:16:47:949 2019


DpWpDynCreate: created new work process W20-24915

Sun Sep 15 02:21:49:067 2019


DpHdlDeadWp: W20 (pid=24915) terminated automatically

Sun Sep 15 02:25:04:287 2019


DpWpDynCreate: created new work process W17-27714

Sun Sep 15 02:30:08:141 2019


DpHdlDeadWp: W17 (pid=27714) terminated automatically

Sun Sep 15 02:30:18:096 2019


DpWpDynCreate: created new work process W18-29649
DpWpDynCreate: created new work process W19-29650

Sun Sep 15 02:33:03:499 2019


DpWpDynCreate: created new work process W7-30614

Sun Sep 15 02:35:19:170 2019


DpWpCheck: dyn W18, pid 29649 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=29650) terminated automatically
DpHdlDeadWp: W18 (pid=29649) terminated automatically

Sun Sep 15 02:35:51:177 2019


DpWpDynCreate: created new work process W20-31921

Sun Sep 15 02:38:18:673 2019


DpWpCheck: dyn W7, pid 30614 no longer needed, terminate now

Sun Sep 15 02:38:18:805 2019


DpHdlDeadWp: W7 (pid=30614) terminated automatically

Sun Sep 15 02:40:19:183 2019


DpWpDynCreate: created new work process W17-1095
DpWpDynCreate: created new work process W19-1096

Sun Sep 15 02:40:58:044 2019


DpHdlDeadWp: W20 (pid=31921) terminated automatically

Sun Sep 15 02:45:38:685 2019


DpWpCheck: dyn W17, pid 1095 no longer needed, terminate now
DpWpCheck: dyn W19, pid 1096 no longer needed, terminate now

Sun Sep 15 02:45:39:200 2019


DpHdlDeadWp: W17 (pid=1095) terminated automatically

Sun Sep 15 02:45:40:841 2019


DpHdlDeadWp: W19 (pid=1096) terminated automatically

Sun Sep 15 02:47:04:630 2019


DpWpDynCreate: created new work process W18-3655

Sun Sep 15 02:47:59:605 2019


DpWpDynCreate: created new work process W7-4009

Sun Sep 15 02:50:16:339 2019


DpWpDynCreate: created new work process W20-4816

Sun Sep 15 02:52:18:694 2019


DpWpCheck: dyn W18, pid 3655 no longer needed, terminate now

Sun Sep 15 02:52:19:159 2019


DpHdlDeadWp: W18 (pid=3655) terminated automatically

Sun Sep 15 02:53:00:092 2019


DpWpDynCreate: created new work process W17-5743

Sun Sep 15 02:53:11:991 2019


DpHdlDeadWp: W7 (pid=4009) terminated automatically

Sun Sep 15 02:55:11:006 2019


DpWpDynCreate: created new work process W19-6658

Sun Sep 15 02:55:18:699 2019


DpWpCheck: dyn W20, pid 4816 no longer needed, terminate now

Sun Sep 15 02:55:19:159 2019


DpHdlDeadWp: W20 (pid=4816) terminated automatically

Sun Sep 15 02:57:57:722 2019


DpWpDynCreate: created new work process W18-7629

Sun Sep 15 02:58:18:704 2019


DpWpCheck: dyn W17, pid 5743 no longer needed, terminate now

Sun Sep 15 02:58:19:305 2019


DpHdlDeadWp: W17 (pid=5743) terminated automatically

Sun Sep 15 02:59:55:994 2019


DpWpDynCreate: created new work process W7-8462

Sun Sep 15 03:00:18:707 2019


DpWpCheck: dyn W19, pid 6658 no longer needed, terminate now

Sun Sep 15 03:00:20:594 2019


DpHdlDeadWp: W19 (pid=6658) terminated automatically

Sun Sep 15 03:02:58:712 2019


DpWpCheck: dyn W18, pid 7629 no longer needed, terminate now

Sun Sep 15 03:02:59:326 2019


DpHdlDeadWp: W18 (pid=7629) terminated automatically

Sun Sep 15 03:03:00:905 2019


DpWpDynCreate: created new work process W20-13828

Sun Sep 15 03:05:00:758 2019


DpHdlDeadWp: W7 (pid=8462) terminated automatically

Sun Sep 15 03:05:17:659 2019


DpWpDynCreate: created new work process W17-22597

Sun Sep 15 03:08:18:721 2019


DpWpCheck: dyn W20, pid 13828 no longer needed, terminate now

Sun Sep 15 03:08:18:986 2019


DpHdlDeadWp: W20 (pid=13828) terminated automatically

Sun Sep 15 03:10:18:725 2019


DpWpCheck: dyn W17, pid 22597 no longer needed, terminate now

Sun Sep 15 03:10:19:166 2019


DpHdlDeadWp: W17 (pid=22597) terminated automatically

Sun Sep 15 03:15:05:078 2019


DpWpDynCreate: created new work process W19-8994

Sun Sep 15 03:20:06:458 2019


DpHdlDeadWp: W19 (pid=8994) terminated automatically

Sun Sep 15 03:20:18:227 2019


DpWpDynCreate: created new work process W18-10705

Sun Sep 15 03:23:59:988 2019


DpWpDynCreate: created new work process W7-12021

Sun Sep 15 03:25:19:518 2019


DpHdlDeadWp: W18 (pid=10705) terminated automatically

Sun Sep 15 03:26:11:419 2019


DpWpDynCreate: created new work process W20-12649

Sun Sep 15 03:29:02:240 2019


DpHdlDeadWp: W7 (pid=12021) terminated automatically

Sun Sep 15 03:30:17:369 2019


DpWpDynCreate: created new work process W17-13963

Sun Sep 15 03:30:58:612 2019


DpWpDynCreate: created new work process W19-14376

Sun Sep 15 03:31:18:757 2019


DpWpCheck: dyn W20, pid 12649 no longer needed, terminate now

Sun Sep 15 03:31:19:355 2019


DpHdlDeadWp: W20 (pid=12649) terminated automatically

Sun Sep 15 03:35:10:722 2019


DpWpDynCreate: created new work process W18-15644

Sun Sep 15 03:35:18:763 2019


DpWpCheck: dyn W17, pid 13963 no longer needed, terminate now

Sun Sep 15 03:35:19:326 2019


DpHdlDeadWp: W17 (pid=13963) terminated automatically

Sun Sep 15 03:35:59:926 2019


DpHdlDeadWp: W19 (pid=14376) terminated automatically

Sun Sep 15 03:40:10:595 2019


DpWpDynCreate: created new work process W7-17081

Sun Sep 15 03:40:11:339 2019


DpHdlDeadWp: W18 (pid=15644) terminated automatically

Sun Sep 15 03:40:14:828 2019


DpWpDynCreate: created new work process W20-17109

Sun Sep 15 03:40:20:362 2019


DpWpDynCreate: created new work process W17-17132

Sun Sep 15 03:45:18:777 2019


DpWpCheck: dyn W7, pid 17081 no longer needed, terminate now
DpWpCheck: dyn W20, pid 17109 no longer needed, terminate now

Sun Sep 15 03:45:19:623 2019


DpHdlDeadWp: W7 (pid=17081) terminated automatically
DpHdlDeadWp: W20 (pid=17109) terminated automatically

Sun Sep 15 03:45:38:778 2019


DpWpCheck: dyn W17, pid 17132 no longer needed, terminate now

Sun Sep 15 03:45:39:697 2019


DpHdlDeadWp: W17 (pid=17132) terminated automatically

Sun Sep 15 03:46:58:783 2019


DpWpDynCreate: created new work process W19-19274
Sun Sep 15 03:51:07:248 2019
DpWpDynCreate: created new work process W18-20823

Sun Sep 15 03:51:59:993 2019


DpHdlDeadWp: W19 (pid=19274) terminated automatically

Sun Sep 15 03:56:10:889 2019


DpHdlDeadWp: W18 (pid=20823) terminated automatically

Sun Sep 15 03:56:18:914 2019


DpWpDynCreate: created new work process W7-22389

Sun Sep 15 04:01:00:391 2019


DpWpDynCreate: created new work process W20-24381

Sun Sep 15 04:01:01:355 2019


DpWpDynCreate: created new work process W17-24384

Sun Sep 15 04:01:38:797 2019


DpWpCheck: dyn W7, pid 22389 no longer needed, terminate now

Sun Sep 15 04:01:39:314 2019


DpHdlDeadWp: W7 (pid=22389) terminated automatically

Sun Sep 15 04:06:01:780 2019


DpHdlDeadWp: W20 (pid=24381) terminated automatically

Sun Sep 15 04:06:03:100 2019


DpHdlDeadWp: W17 (pid=24384) terminated automatically

Sun Sep 15 04:06:14:735 2019


DpWpDynCreate: created new work process W19-10145

Sun Sep 15 04:11:15:875 2019


DpWpDynCreate: created new work process W18-22784

Sun Sep 15 04:11:18:811 2019


DpWpCheck: dyn W19, pid 10145 no longer needed, terminate now

Sun Sep 15 04:11:19:828 2019


DpHdlDeadWp: W19 (pid=10145) terminated automatically

Sun Sep 15 04:15:53:888 2019


DpWpDynCreate: created new work process W7-24792

Sun Sep 15 04:16:18:819 2019


DpWpCheck: dyn W18, pid 22784 no longer needed, terminate now

Sun Sep 15 04:16:19:146 2019


DpHdlDeadWp: W18 (pid=22784) terminated automatically

Sun Sep 15 04:20:58:804 2019


DpHdlDeadWp: W7 (pid=24792) terminated automatically
DpWpDynCreate: created new work process W20-26652

Sun Sep 15 04:21:16:330 2019


DpWpDynCreate: created new work process W17-26668
Sun Sep 15 04:26:03:409 2019
DpHdlDeadWp: W20 (pid=26652) terminated automatically

Sun Sep 15 04:26:08:690 2019


DpWpDynCreate: created new work process W19-28182

Sun Sep 15 04:26:18:835 2019


DpWpCheck: dyn W17, pid 26668 no longer needed, terminate now

Sun Sep 15 04:26:19:110 2019


DpHdlDeadWp: W17 (pid=26668) terminated automatically

Sun Sep 15 04:30:29:068 2019


DpHdlDeadWp: W10 (pid=20287) terminated automatically
DpWpDynCreate: created new work process W10-31095

Sun Sep 15 04:30:53:902 2019


DpWpDynCreate: created new work process W18-31727

Sun Sep 15 04:31:14:566 2019


DpHdlDeadWp: W19 (pid=28182) terminated automatically

Sun Sep 15 04:31:37:299 2019


DpHdlDeadWp: W12 (pid=5545) terminated automatically
DpWpDynCreate: created new work process W12-462

Sun Sep 15 04:35:04:586 2019


DpHdlDeadWp: W13 (pid=5982) terminated automatically
DpWpDynCreate: created new work process W13-1840
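The W10, W12, and W13 entries just above differ from the dynamic W7/W17-W20 cycle: each terminates and is immediately recreated under the same worker number, which reads as a restart of a standing work process rather than a dynamic scale-down reaped by `DpWpCheck`. A hedged sketch (the function `immediate_restarts` is my own naming) that flags such terminate-then-recreate pairs:

```python
import re

# Assumed classifier, not from the trace: flag work processes that are
# recreated immediately after terminating (restarts), as opposed to
# dynamic workers that simply disappear after DpWpCheck reaps them.
DEAD = re.compile(r"DpHdlDeadWp: (W\d+) \(pid=\d+\) terminated")
CREATE = re.compile(r"DpWpDynCreate: created new work process (W\d+)-\d+")

def immediate_restarts(lines):
    restarts, last_dead = [], None
    for line in lines:
        m = DEAD.search(line)
        if m:  # remember the most recent terminated worker number
            last_dead = m.group(1)
            continue
        m = CREATE.search(line)
        if m:
            if m.group(1) == last_dead:  # same W number right after death
                restarts.append(m.group(1))
            last_dead = None
    return restarts

sample = [
    "DpHdlDeadWp: W10 (pid=20287) terminated automatically",
    "DpWpDynCreate: created new work process W10-31095",
    "DpWpDynCreate: created new work process W18-31727",
]
print(immediate_restarts(sample))  # ['W10']
```

Here W10 is flagged as a restart, while the W18 creation (no preceding W18 death) is a normal dynamic spawn.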

Sun Sep 15 04:35:57:305 2019


DpHdlDeadWp: W18 (pid=31727) terminated automatically

Sun Sep 15 04:36:03:733 2019


DpWpDynCreate: created new work process W7-2205

Sun Sep 15 04:36:10:439 2019


DpWpDynCreate: created new work process W20-2208

Sun Sep 15 04:41:18:677 2019


DpWpCheck: dyn W7, pid 2205 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=2208) terminated automatically
DpHdlDeadWp: W7 (pid=2205) terminated automatically

Sun Sep 15 04:41:19:438 2019


DpWpDynCreate: created new work process W17-4745

Sun Sep 15 04:41:19:688 2019


DpWpDynCreate: created new work process W19-4746

Sun Sep 15 04:46:24:401 2019


DpHdlDeadWp: W17 (pid=4745) terminated automatically
DpHdlDeadWp: W19 (pid=4746) terminated automatically

Sun Sep 15 04:51:06:570 2019


DpWpDynCreate: created new work process W18-8586

Sun Sep 15 04:51:17:216 2019


DpWpDynCreate: created new work process W20-8661
Sun Sep 15 04:51:17:342 2019
DpWpDynCreate: created new work process W7-8662

Sun Sep 15 04:54:24:599 2019


DpHdlDeadWp: W11 (pid=22396) terminated automatically
DpWpDynCreate: created new work process W11-9512

Sun Sep 15 04:56:08:453 2019


DpHdlDeadWp: W18 (pid=8586) terminated automatically

Sun Sep 15 04:56:18:887 2019


DpWpCheck: dyn W7, pid 8662 no longer needed, terminate now
DpWpCheck: dyn W20, pid 8661 no longer needed, terminate now

Sun Sep 15 04:56:19:754 2019


DpHdlDeadWp: W7 (pid=8662) terminated automatically
DpHdlDeadWp: W20 (pid=8661) terminated automatically

Sun Sep 15 05:01:17:336 2019


DpWpDynCreate: created new work process W17-12033

Sun Sep 15 05:02:59:170 2019


DpWpDynCreate: created new work process W19-17876

Sun Sep 15 05:03:38:899 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T9_U17451 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T9_U17451_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |03:21:00|18 |SAPLSBCS_OUT |high| | |SOST |
DpHdlSoftCancel: cancel request for T9_U17451_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
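This is the one non-routine event in the section: `DpCheckTerminals` pinged the SAP GUI front end of session T9_U17451, got no answer within 60 seconds (`NIENO_ANSWER`), and the dispatcher soft-cancelled the session. A small sketch (the helper `soft_cancels` is my own, assuming the cancel message sits on one line) that extracts such disconnects from the trace:

```python
import re

# Assumed extractor, not from the trace: list sessions the dispatcher
# soft-cancelled, together with the cancel reason code.
CANCEL = re.compile(
    r"DpHdlSoftCancel: cancel request for (\S+) received from \S+ "
    r"\(reason=(\w+)\)"
)

def soft_cancels(text):
    return CANCEL.findall(text)

log = ("DpHdlSoftCancel: cancel request for T9_U17451_M0 received from DISP "
       "(reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)")
print(soft_cancels(log))  # [('T9_U17451_M0', 'DP_SOFTCANCEL_SAP_GUI_DISCONNECT')]
```

Counting occurrences of each reason code over a full trace is a quick way to see whether GUI disconnects like this one are isolated or systematic.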

Sun Sep 15 05:06:18:302 2019


DpHdlDeadWp: W17 (pid=12033) terminated automatically

Sun Sep 15 05:08:01:151 2019


DpHdlDeadWp: W19 (pid=17876) terminated automatically

Sun Sep 15 05:11:16:711 2019


DpWpDynCreate: created new work process W18-10719

Sun Sep 15 05:16:07:327 2019


DpWpDynCreate: created new work process W7-12308

Sun Sep 15 05:16:11:446 2019


DpWpDynCreate: created new work process W20-12351

Sun Sep 15 05:16:18:918 2019


DpWpCheck: dyn W18, pid 10719 no longer needed, terminate now

Sun Sep 15 05:16:19:561 2019


DpHdlDeadWp: W18 (pid=10719) terminated automatically

Sun Sep 15 05:21:11:939 2019


DpHdlDeadWp: W7 (pid=12308) terminated automatically

Sun Sep 15 05:21:12:545 2019


DpHdlDeadWp: W20 (pid=12351) terminated automatically

Sun Sep 15 05:21:16:952 2019


DpWpDynCreate: created new work process W17-14149

Sun Sep 15 05:25:58:842 2019


DpWpDynCreate: created new work process W19-15982

Sun Sep 15 05:26:19:190 2019


DpHdlDeadWp: W17 (pid=14149) terminated automatically

Sun Sep 15 05:27:01:206 2019


DpWpDynCreate: created new work process W18-16371

Sun Sep 15 05:30:59:522 2019


DpHdlDeadWp: W19 (pid=15982) terminated automatically

Sun Sep 15 05:32:18:942 2019


DpWpCheck: dyn W18, pid 16371 no longer needed, terminate now

Sun Sep 15 05:32:19:606 2019


DpHdlDeadWp: W18 (pid=16371) terminated automatically

Sun Sep 15 05:36:10:891 2019


DpWpDynCreate: created new work process W7-19210

Sun Sep 15 05:41:15:510 2019


DpHdlDeadWp: W7 (pid=19210) terminated automatically

Sun Sep 15 05:41:16:250 2019


DpWpDynCreate: created new work process W20-20884

Sun Sep 15 05:46:05:734 2019


DpWpDynCreate: created new work process W17-22514

Sun Sep 15 05:46:18:967 2019


DpWpCheck: dyn W20, pid 20884 no longer needed, terminate now

Sun Sep 15 05:46:19:324 2019


DpHdlDeadWp: W20 (pid=20884) terminated automatically

Sun Sep 15 05:47:59:254 2019


DpWpDynCreate: created new work process W19-23260

Sun Sep 15 05:51:06:651 2019


DpHdlDeadWp: W17 (pid=22514) terminated automatically

Sun Sep 15 05:53:00:419 2019


DpHdlDeadWp: W19 (pid=23260) terminated automatically

Sun Sep 15 05:55:59:297 2019


DpWpDynCreate: created new work process W18-26221

Sun Sep 15 06:00:53:697 2019


DpWpDynCreate: created new work process W7-28016

Sun Sep 15 06:01:02:111 2019


DpHdlDeadWp: W18 (pid=26221) terminated automatically
Sun Sep 15 06:05:57:446 2019
DpHdlDeadWp: W7 (pid=28016) terminated automatically

Sun Sep 15 06:06:06:612 2019


DpWpDynCreate: created new work process W20-13580

Sun Sep 15 06:06:13:271 2019


DpWpDynCreate: created new work process W17-13832

Sun Sep 15 06:11:17:285 2019


DpHdlDeadWp: W17 (pid=13832) terminated automatically
DpWpCheck: dyn W20, pid 13580 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=13580) terminated automatically

Sun Sep 15 06:11:18:302 2019


DpWpDynCreate: created new work process W19-26736

Sun Sep 15 06:13:58:807 2019


DpWpDynCreate: created new work process W18-27742

Sun Sep 15 06:16:10:851 2019


DpWpDynCreate: created new work process W7-28536

Sun Sep 15 06:16:19:011 2019


DpWpCheck: dyn W19, pid 26736 no longer needed, terminate now

Sun Sep 15 06:16:19:902 2019


DpHdlDeadWp: W19 (pid=26736) terminated automatically

Sun Sep 15 06:18:59:017 2019


DpWpCheck: dyn W18, pid 27742 no longer needed, terminate now

Sun Sep 15 06:19:00:925 2019


DpHdlDeadWp: W18 (pid=27742) terminated automatically

Sun Sep 15 06:20:00:455 2019


DpWpDynCreate: created new work process W17-29829

Sun Sep 15 06:21:18:960 2019


DpWpDynCreate: created new work process W20-30145

Sun Sep 15 06:21:19:022 2019


DpWpCheck: dyn W7, pid 28536 no longer needed, terminate now

Sun Sep 15 06:21:19:167 2019


DpHdlDeadWp: W7 (pid=28536) terminated automatically

Sun Sep 15 06:24:59:106 2019


DpWpDynCreate: created new work process W19-31498

Sun Sep 15 06:25:19:030 2019


DpWpCheck: dyn W17, pid 29829 no longer needed, terminate now

Sun Sep 15 06:25:19:207 2019


DpHdlDeadWp: W17 (pid=29829) terminated automatically

Sun Sep 15 06:26:19:032 2019


DpWpCheck: dyn W20, pid 30145 no longer needed, terminate now
Sun Sep 15 06:26:19:314 2019
DpHdlDeadWp: W20 (pid=30145) terminated automatically

Sun Sep 15 06:30:00:366 2019


DpHdlDeadWp: W19 (pid=31498) terminated automatically

Sun Sep 15 06:31:16:667 2019


DpWpDynCreate: created new work process W18-941

Sun Sep 15 06:31:16:971 2019


DpWpDynCreate: created new work process W7-942

Sun Sep 15 06:36:00:326 2019


DpWpDynCreate: created new work process W17-2881

Sun Sep 15 06:36:18:264 2019


DpWpCheck: dyn W7, pid 942 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=941) terminated automatically

Sun Sep 15 06:36:18:844 2019


DpHdlDeadWp: W7 (pid=942) terminated automatically

Sun Sep 15 06:37:00:889 2019


DpWpDynCreate: created new work process W20-3168

Sun Sep 15 06:41:01:538 2019


DpHdlDeadWp: W17 (pid=2881) terminated automatically

Sun Sep 15 06:41:16:309 2019


DpWpDynCreate: created new work process W19-4383
DpWpDynCreate: created new work process W18-4384

Sun Sep 15 06:42:02:120 2019


DpHdlDeadWp: W20 (pid=3168) terminated automatically

Sun Sep 15 06:45:59:064 2019


DpWpDynCreate: created new work process W7-6215

Sun Sep 15 06:46:17:550 2019


DpHdlDeadWp: W19 (pid=4383) terminated automatically

Sun Sep 15 06:46:17:915 2019


DpHdlDeadWp: W18 (pid=4384) terminated automatically

Sun Sep 15 06:51:01:382 2019


DpWpDynCreate: created new work process W17-7887

Sun Sep 15 06:51:03:455 2019


DpHdlDeadWp: W7 (pid=6215) terminated automatically
DpWpDynCreate: created new work process W7-7892

Sun Sep 15 06:56:03:610 2019


DpHdlDeadWp: W17 (pid=7887) terminated automatically

Sun Sep 15 06:56:04:709 2019


DpHdlDeadWp: W7 (pid=7892) terminated automatically

Sun Sep 15 06:56:06:368 2019


DpWpDynCreate: created new work process W20-9539
Sun Sep 15 07:00:55:141 2019
DpWpDynCreate: created new work process W19-11314

Sun Sep 15 07:01:19:091 2019


DpWpCheck: dyn W20, pid 9539 no longer needed, terminate now

Sun Sep 15 07:01:20:098 2019


DpHdlDeadWp: W20 (pid=9539) terminated automatically

Sun Sep 15 07:02:58:974 2019


DpWpDynCreate: created new work process W18-16868

Sun Sep 15 07:05:59:098 2019


DpWpCheck: dyn W19, pid 11314 no longer needed, terminate now

Sun Sep 15 07:05:59:385 2019


DpHdlDeadWp: W19 (pid=11314) terminated automatically

Sun Sep 15 07:06:13:354 2019


DpWpDynCreate: created new work process W17-28733

Sun Sep 15 07:07:59:100 2019


DpWpCheck: dyn W18, pid 16868 no longer needed, terminate now

Sun Sep 15 07:07:59:586 2019


DpHdlDeadWp: W18 (pid=16868) terminated automatically

Sun Sep 15 07:11:16:871 2019


DpHdlDeadWp: W17 (pid=28733) terminated automatically

Sun Sep 15 07:11:17:759 2019


DpWpDynCreate: created new work process W7-10101

Sun Sep 15 07:15:53:446 2019


DpWpDynCreate: created new work process W20-11710

Sun Sep 15 07:16:19:118 2019


DpWpCheck: dyn W7, pid 10101 no longer needed, terminate now

Sun Sep 15 07:16:20:062 2019


DpHdlDeadWp: W7 (pid=10101) terminated automatically

Sun Sep 15 07:19:59:531 2019


DpWpDynCreate: created new work process W19-12918

Sun Sep 15 07:20:59:019 2019


DpHdlDeadWp: W20 (pid=11710) terminated automatically
DpWpDynCreate: created new work process W20-13314
DpWpDynCreate: created new work process W18-13315

Sun Sep 15 07:25:02:112 2019


DpHdlDeadWp: W19 (pid=12918) terminated automatically

Sun Sep 15 07:26:00:293 2019


DpWpCheck: dyn W18, pid 13315 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=13314) terminated automatically

Sun Sep 15 07:26:00:537 2019


DpHdlDeadWp: W18 (pid=13315) terminated automatically

Sun Sep 15 07:26:09:862 2019


DpWpDynCreate: created new work process W17-15116

Sun Sep 15 07:31:10:141 2019


DpHdlDeadWp: W17 (pid=15116) terminated automatically

Sun Sep 15 07:31:16:698 2019


DpWpDynCreate: created new work process W7-16832

Sun Sep 15 07:31:16:949 2019


DpWpDynCreate: created new work process W19-16833

Sun Sep 15 07:34:02:816 2019


DpWpDynCreate: created new work process W20-17735

Sun Sep 15 07:36:17:872 2019


DpHdlDeadWp: W7 (pid=16832) terminated automatically
DpHdlDeadWp: W19 (pid=16833) terminated automatically

Sun Sep 15 07:39:03:512 2019


DpHdlDeadWp: W20 (pid=17735) terminated automatically

Sun Sep 15 07:40:02:844 2019


DpWpDynCreate: created new work process W18-19597

Sun Sep 15 07:41:16:627 2019


DpWpDynCreate: created new work process W17-20027

Sun Sep 15 07:45:05:543 2019


DpHdlDeadWp: W18 (pid=19597) terminated automatically

Sun Sep 15 07:46:19:169 2019


DpWpCheck: dyn W17, pid 20027 no longer needed, terminate now

Sun Sep 15 07:46:19:680 2019


DpHdlDeadWp: W17 (pid=20027) terminated automatically

Sun Sep 15 07:50:40:615 2019


DpHdlDeadWp: W9 (pid=23644) terminated automatically
DpWpDynCreate: created new work process W9-23323

Sun Sep 15 07:50:51:081 2019


DpWpDynCreate: created new work process W7-23348

Sun Sep 15 07:55:59:127 2019


DpWpDynCreate: created new work process W19-25170

Sun Sep 15 07:56:00:503 2019


DpHdlDeadWp: W7 (pid=23348) terminated automatically

Sun Sep 15 07:56:03:748 2019


DpHdlDeadWp: W13 (pid=1840) terminated automatically
DpWpDynCreate: created new work process W13-25178

Sun Sep 15 07:56:12:727 2019


DpWpDynCreate: created new work process W20-25191
Sun Sep 15 08:01:00:590 2019
DpHdlDeadWp: W19 (pid=25170) terminated automatically

Sun Sep 15 08:01:13:226 2019


DpHdlDeadWp: W20 (pid=25191) terminated automatically

Sun Sep 15 08:01:15:752 2019


DpWpDynCreate: created new work process W18-27119

Sun Sep 15 08:06:16:311 2019


DpHdlDeadWp: W18 (pid=27119) terminated automatically

Sun Sep 15 08:11:15:486 2019


DpWpDynCreate: created new work process W17-25651
DpWpDynCreate: created new work process W7-25652

Sun Sep 15 08:16:17:870 2019


DpHdlDeadWp: W7 (pid=25652) terminated automatically
DpWpCheck: dyn W17, pid 25651 no longer needed, terminate now

Sun Sep 15 08:16:18:402 2019


DpHdlDeadWp: W17 (pid=25651) terminated automatically

Sun Sep 15 08:16:49:034 2019


DpWpDynCreate: created new work process W19-27387

Sun Sep 15 08:20:59:543 2019


DpWpDynCreate: created new work process W20-29011

Sun Sep 15 08:21:58:524 2019


DpHdlDeadWp: W19 (pid=27387) terminated automatically

Sun Sep 15 08:22:07:543 2019


DpWpDynCreate: created new work process W18-29462

Sun Sep 15 08:26:01:929 2019


DpHdlDeadWp: W20 (pid=29011) terminated automatically

Sun Sep 15 08:26:08:693 2019


DpWpDynCreate: created new work process W7-30572

Sun Sep 15 08:27:19:240 2019


DpWpCheck: dyn W18, pid 29462 no longer needed, terminate now

Sun Sep 15 08:27:20:080 2019


DpHdlDeadWp: W18 (pid=29462) terminated automatically

Sun Sep 15 08:29:00:717 2019


DpWpDynCreate: created new work process W17-31553

Sun Sep 15 08:31:19:245 2019


DpWpCheck: dyn W7, pid 30572 no longer needed, terminate now

Sun Sep 15 08:31:20:321 2019


DpHdlDeadWp: W7 (pid=30572) terminated automatically

Sun Sep 15 08:34:03:623 2019


DpHdlDeadWp: W17 (pid=31553) terminated automatically
Sun Sep 15 08:35:01:292 2019
DpWpDynCreate: created new work process W19-1107

Sun Sep 15 08:40:02:919 2019


DpHdlDeadWp: W19 (pid=1107) terminated automatically

Sun Sep 15 08:41:16:205 2019


DpWpDynCreate: created new work process W20-3352

Sun Sep 15 08:43:34:223 2019


DpWpDynCreate: created new work process W18-4204

Sun Sep 15 08:46:18:282 2019


DpHdlDeadWp: W20 (pid=3352) terminated automatically

Sun Sep 15 08:48:39:269 2019


DpWpCheck: dyn W18, pid 4204 no longer needed, terminate now

Sun Sep 15 08:48:39:489 2019


DpHdlDeadWp: W18 (pid=4204) terminated automatically

Sun Sep 15 08:51:15:732 2019


DpWpDynCreate: created new work process W7-6911

Sun Sep 15 08:51:16:013 2019


DpWpDynCreate: created new work process W17-6915

Sun Sep 15 08:56:19:285 2019


DpWpCheck: dyn W7, pid 6911 no longer needed, terminate now
DpWpCheck: dyn W17, pid 6915 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=6911) terminated automatically
DpHdlDeadWp: W17 (pid=6915) terminated automatically

Sun Sep 15 08:57:02:970 2019


DpWpDynCreate: created new work process W19-8868

Sun Sep 15 09:01:18:307 2019


DpWpDynCreate: created new work process W20-11762

Sun Sep 15 09:02:03:787 2019


DpHdlDeadWp: W19 (pid=8868) terminated automatically

Sun Sep 15 09:03:59:291 2019


DpWpDynCreate: created new work process W18-22783

Sun Sep 15 09:04:00:001 2019


DpWpDynCreate: created new work process W7-22820

Sun Sep 15 09:06:19:302 2019


DpWpCheck: dyn W20, pid 11762 no longer needed, terminate now

Sun Sep 15 09:06:19:414 2019


DpHdlDeadWp: W20 (pid=11762) terminated automatically

Sun Sep 15 09:09:01:634 2019


DpWpCheck: dyn W7, pid 22820 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=22783) terminated automatically

Sun Sep 15 09:09:02:285 2019


DpHdlDeadWp: W7 (pid=22820) terminated automatically

Sun Sep 15 09:14:02:618 2019


DpWpDynCreate: created new work process W17-9742

Sun Sep 15 09:18:01:539 2019


DpWpDynCreate: created new work process W19-11174

Sun Sep 15 09:19:06:179 2019


DpHdlDeadWp: W17 (pid=9742) terminated automatically

Sun Sep 15 09:22:11:785 2019


DpWpDynCreate: created new work process W20-12470

Sun Sep 15 09:23:19:330 2019


DpWpCheck: dyn W19, pid 11174 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=11174) terminated automatically

Sun Sep 15 09:27:12:798 2019


DpWpDynCreate: created new work process W18-14183

Sun Sep 15 09:27:13:822 2019


DpHdlDeadWp: W20 (pid=12470) terminated automatically

Sun Sep 15 09:28:58:396 2019


DpWpDynCreate: created new work process W7-14875

Sun Sep 15 09:32:13:877 2019


DpHdlDeadWp: W18 (pid=14183) terminated automatically

Sun Sep 15 09:33:59:214 2019


DpWpDynCreate: created new work process W17-16587

Sun Sep 15 09:33:59:347 2019


DpWpCheck: dyn W7, pid 14875 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=14875) terminated automatically

Sun Sep 15 09:33:59:527 2019


DpWpDynCreate: created new work process W19-16589

Sun Sep 15 09:39:01:148 2019


DpWpDynCreate: created new work process W20-18211
DpWpCheck: dyn W17, pid 16587 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=16589) terminated automatically
DpWpDynCreate: created new work process W19-18212
DpHdlDeadWp: W17 (pid=16587) terminated automatically

Sun Sep 15 09:44:03:050 2019


DpHdlDeadWp: W19 (pid=18212) terminated automatically
DpWpCheck: dyn W20, pid 18211 no longer needed, terminate now

Sun Sep 15 09:44:03:923 2019


DpHdlDeadWp: W20 (pid=18211) terminated automatically

Sun Sep 15 09:51:59:397 2019


DpWpDynCreate: created new work process W18-22324

Sun Sep 15 09:52:25:140 2019


DpWpDynCreate: created new work process W7-22637
Sun Sep 15 09:52:25:303 2019
DpWpDynCreate: created new work process W17-22638

Sun Sep 15 09:52:59:989 2019


DpWpDynCreate: created new work process W19-22830

Sun Sep 15 09:57:01:177 2019


DpHdlDeadWp: W18 (pid=22324) terminated automatically

Sun Sep 15 09:57:08:464 2019


DpWpDynCreate: created new work process W20-24357

Sun Sep 15 09:57:39:389 2019


DpWpCheck: dyn W7, pid 22637 no longer needed, terminate now
DpWpCheck: dyn W17, pid 22638 no longer needed, terminate now

Sun Sep 15 09:57:39:757 2019


DpHdlDeadWp: W7 (pid=22637) terminated automatically
DpHdlDeadWp: W17 (pid=22638) terminated automatically

Sun Sep 15 09:58:19:390 2019


DpWpCheck: dyn W19, pid 22830 no longer needed, terminate now

Sun Sep 15 09:58:19:798 2019


DpHdlDeadWp: W19 (pid=22830) terminated automatically

Sun Sep 15 10:00:58:853 2019


DpWpDynCreate: created new work process W18-25820

Sun Sep 15 10:02:10:840 2019


DpHdlDeadWp: W20 (pid=24357) terminated automatically

Sun Sep 15 10:05:01:005 2019


DpWpDynCreate: created new work process W7-8693

Sun Sep 15 10:05:02:537 2019


DpWpDynCreate: created new work process W17-8697

Sun Sep 15 10:05:59:529 2019


DpHdlDeadWp: W18 (pid=25820) terminated automatically

Sun Sep 15 10:06:04:088 2019


DpWpDynCreate: created new work process W19-12122

Sun Sep 15 10:10:02:708 2019


DpHdlDeadWp: W7 (pid=8693) terminated automatically

Sun Sep 15 10:10:19:535 2019


DpWpCheck: dyn W17, pid 8697 no longer needed, terminate now

Sun Sep 15 10:10:19:794 2019


DpHdlDeadWp: W17 (pid=8697) terminated automatically

Sun Sep 15 10:11:19:537 2019


DpWpCheck: dyn W19, pid 12122 no longer needed, terminate now

Sun Sep 15 10:11:19:846 2019


DpHdlDeadWp: W19 (pid=12122) terminated automatically
Sun Sep 15 10:12:40:544 2019
DpWpDynCreate: created new work process W20-25070

Sun Sep 15 10:12:40:838 2019


DpWpDynCreate: created new work process W18-25071

Sun Sep 15 10:17:45:923 2019


DpHdlDeadWp: W18 (pid=25071) terminated automatically
DpWpCheck: dyn W20, pid 25070 no longer needed, terminate now

Sun Sep 15 10:17:47:002 2019


DpHdlDeadWp: W20 (pid=25070) terminated automatically

Sun Sep 15 10:18:07:923 2019


DpWpDynCreate: created new work process W7-26730

Sun Sep 15 10:18:14:150 2019


DpWpDynCreate: created new work process W17-26759

Sun Sep 15 10:20:59:619 2019


DpWpDynCreate: created new work process W19-27816

Sun Sep 15 10:23:19:556 2019


DpWpCheck: dyn W7, pid 26730 no longer needed, terminate now
DpWpCheck: dyn W17, pid 26759 no longer needed, terminate now

Sun Sep 15 10:23:20:327 2019


DpHdlDeadWp: W7 (pid=26730) terminated automatically
DpHdlDeadWp: W17 (pid=26759) terminated automatically

Sun Sep 15 10:23:58:977 2019


DpWpDynCreate: created new work process W18-28704

Sun Sep 15 10:26:04:477 2019


DpHdlDeadWp: W19 (pid=27816) terminated automatically

Sun Sep 15 10:28:59:676 2019


DpHdlDeadWp: W18 (pid=28704) terminated automatically

Sun Sep 15 10:32:21:666 2019


DpWpDynCreate: created new work process W20-31518

Sun Sep 15 10:33:59:028 2019


DpWpDynCreate: created new work process W7-32016

Sun Sep 15 10:37:39:690 2019


DpWpCheck: dyn W20, pid 31518 no longer needed, terminate now

Sun Sep 15 10:37:40:166 2019


DpHdlDeadWp: W20 (pid=31518) terminated automatically

Sun Sep 15 10:38:58:899 2019


DpWpDynCreate: created new work process W17-1287

Sun Sep 15 10:39:01:302 2019


DpHdlDeadWp: W7 (pid=32016) terminated automatically

Sun Sep 15 10:43:07:863 2019


DpWpDynCreate: created new work process W19-2809

Sun Sep 15 10:43:59:700 2019


DpWpCheck: dyn W17, pid 1287 no longer needed, terminate now

Sun Sep 15 10:44:00:385 2019


DpHdlDeadWp: W17 (pid=1287) terminated automatically

Sun Sep 15 10:44:59:083 2019


DpWpDynCreate: created new work process W18-3396

Sun Sep 15 10:48:10:806 2019


DpHdlDeadWp: W19 (pid=2809) terminated automatically

Sun Sep 15 10:50:02:657 2019


DpHdlDeadWp: W18 (pid=3396) terminated automatically

Sun Sep 15 10:52:07:030 2019


DpWpDynCreate: created new work process W20-5896

Sun Sep 15 10:52:18:324 2019


DpWpDynCreate: created new work process W7-5901

Sun Sep 15 10:57:08:549 2019


DpHdlDeadWp: W20 (pid=5896) terminated automatically

Sun Sep 15 10:57:08:871 2019


DpWpDynCreate: created new work process W17-7572

Sun Sep 15 10:57:19:727 2019


DpWpCheck: dyn W7, pid 5901 no longer needed, terminate now

Sun Sep 15 10:57:20:082 2019


DpHdlDeadWp: W7 (pid=5901) terminated automatically

Sun Sep 15 11:02:09:341 2019


DpHdlDeadWp: W17 (pid=7572) terminated automatically

Sun Sep 15 11:02:18:901 2019


DpWpDynCreate: created new work process W19-12863
DpWpDynCreate: created new work process W18-12864

Sun Sep 15 11:07:19:417 2019


DpHdlDeadWp: W18 (pid=12864) terminated automatically
DpHdlDeadWp: W19 (pid=12863) terminated automatically

Sun Sep 15 11:12:10:401 2019


DpWpDynCreate: created new work process W20-8181

Sun Sep 15 11:13:04:082 2019


DpHdlDeadWp: W5 (pid=32124) terminated automatically
DpWpDynCreate: created new work process W5-8468

Sun Sep 15 11:13:14:651 2019


DpWpDynCreate: created new work process W7-8491

Sun Sep 15 11:17:19:765 2019


DpWpCheck: dyn W20, pid 8181 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=8181) terminated automatically
DpWpDynCreate: created new work process W17-10297

Sun Sep 15 11:18:19:766 2019


DpWpCheck: dyn W7, pid 8491 no longer needed, terminate now

Sun Sep 15 11:18:20:107 2019


DpHdlDeadWp: W7 (pid=8491) terminated automatically

Sun Sep 15 11:20:58:236 2019


DpHdlDeadWp: W11 (pid=9512) terminated automatically
DpWpDynCreate: created new work process W11-11431

Sun Sep 15 11:22:21:449 2019


DpHdlDeadWp: W17 (pid=10297) terminated automatically

Sun Sep 15 11:34:05:796 2019


DpWpDynCreate: created new work process W18-15659

Sun Sep 15 11:39:19:804 2019


DpWpCheck: dyn W18, pid 15659 no longer needed, terminate now

Sun Sep 15 11:39:20:092 2019


DpHdlDeadWp: W18 (pid=15659) terminated automatically

Sun Sep 15 11:39:29:100 2019


DpWpDynCreate: created new work process W19-17422

Sun Sep 15 11:44:39:814 2019


DpWpCheck: dyn W19, pid 17422 no longer needed, terminate now

Sun Sep 15 11:44:40:303 2019


DpHdlDeadWp: W19 (pid=17422) terminated automatically

Sun Sep 15 11:45:40:660 2019


DpWpDynCreate: created new work process W20-19532

Sun Sep 15 11:50:42:311 2019


DpHdlDeadWp: W20 (pid=19532) terminated automatically

Sun Sep 15 11:54:48:806 2019


DpWpDynCreate: created new work process W7-22624

Sun Sep 15 11:59:52:930 2019


DpHdlDeadWp: W7 (pid=22624) terminated automatically

Sun Sep 15 12:05:12:058 2019


DpWpDynCreate: created new work process W17-420

Sun Sep 15 12:10:14:017 2019


DpHdlDeadWp: W17 (pid=420) terminated automatically

Sun Sep 15 12:13:38:702 2019


DpWpDynCreate: created new work process W18-24637

Sun Sep 15 12:18:39:877 2019


DpWpCheck: dyn W18, pid 24637 no longer needed, terminate now

Sun Sep 15 12:18:40:536 2019


DpHdlDeadWp: W18 (pid=24637) terminated automatically
Sun Sep 15 12:23:28:479 2019
DpWpDynCreate: created new work process W19-27838

Sun Sep 15 12:23:39:804 2019


DpWpDynCreate: created new work process W20-28020

Sun Sep 15 12:23:39:954 2019


DpWpDynCreate: created new work process W7-28021

Sun Sep 15 12:28:39:893 2019


DpHdlDeadWp: W19 (pid=27838) terminated automatically

Sun Sep 15 12:28:41:013 2019


DpHdlDeadWp: W7 (pid=28021) terminated automatically
DpWpCheck: dyn W20, pid 28020 no longer needed, terminate now

Sun Sep 15 12:28:41:214 2019


DpHdlDeadWp: W20 (pid=28020) terminated automatically

Sun Sep 15 12:34:47:835 2019


DpWpDynCreate: created new work process W17-31567

Sun Sep 15 12:34:48:085 2019


DpWpDynCreate: created new work process W18-31569

Sun Sep 15 12:39:55:889 2019


DpHdlDeadWp: W17 (pid=31567) terminated automatically
DpWpCheck: dyn W18, pid 31569 no longer needed, terminate now

Sun Sep 15 12:39:56:499 2019


DpHdlDeadWp: W18 (pid=31569) terminated automatically

Sun Sep 15 12:44:47:023 2019


DpWpDynCreate: created new work process W19-2383
DpWpDynCreate: created new work process W7-2386

Sun Sep 15 12:49:48:306 2019


DpHdlDeadWp: W7 (pid=2386) terminated automatically
DpHdlDeadWp: W19 (pid=2383) terminated automatically

Sun Sep 15 12:54:41:730 2019


DpWpDynCreate: created new work process W20-5818

Sun Sep 15 12:55:41:763 2019


DpWpDynCreate: created new work process W17-6115

Sun Sep 15 12:59:43:829 2019


DpHdlDeadWp: W20 (pid=5818) terminated automatically

Sun Sep 15 13:00:52:922 2019


DpHdlDeadWp: W17 (pid=6115) terminated automatically

Sun Sep 15 13:04:54:366 2019


DpWpDynCreate: created new work process W18-22950

Sun Sep 15 13:05:50:495 2019


DpWpDynCreate: created new work process W7-25339
Sun Sep 15 13:07:04:646 2019
DpWpDynCreate: created new work process W19-30300

Sun Sep 15 13:09:59:042 2019


DpHdlDeadWp: W18 (pid=22950) terminated automatically

Sun Sep 15 13:10:58:184 2019


DpHdlDeadWp: W7 (pid=25339) terminated automatically

Sun Sep 15 13:12:19:964 2019


DpWpCheck: dyn W19, pid 30300 no longer needed, terminate now

Sun Sep 15 13:12:20:127 2019


DpHdlDeadWp: W19 (pid=30300) terminated automatically

Sun Sep 15 13:15:01:619 2019


DpWpDynCreate: created new work process W20-8102

Sun Sep 15 13:20:04:143 2019


DpHdlDeadWp: W20 (pid=8102) terminated automatically

Sun Sep 15 13:30:17:206 2019


DpWpDynCreate: created new work process W17-13030
DpWpDynCreate: created new work process W18-13031

Sun Sep 15 13:35:20:009 2019


DpWpCheck: dyn W17, pid 13030 no longer needed, terminate now
DpWpCheck: dyn W18, pid 13031 no longer needed, terminate now

Sun Sep 15 13:35:20:352 2019


DpHdlDeadWp: W17 (pid=13030) terminated automatically

Sun Sep 15 13:35:21:908 2019


DpHdlDeadWp: W18 (pid=13031) terminated automatically

Sun Sep 15 13:40:07:873 2019


DpWpDynCreate: created new work process W7-16456

Sun Sep 15 13:40:13:745 2019


DpWpDynCreate: created new work process W19-16459

Sun Sep 15 13:45:08:959 2019


DpHdlDeadWp: W7 (pid=16456) terminated automatically

Sun Sep 15 13:45:14:421 2019


DpHdlDeadWp: W19 (pid=16459) terminated automatically

Sun Sep 15 13:46:05:232 2019


DpWpDynCreate: created new work process W20-18256

Sun Sep 15 13:50:14:315 2019


DpWpDynCreate: created new work process W17-19500

Sun Sep 15 13:51:16:388 2019


DpHdlDeadWp: W20 (pid=18256) terminated automatically

Sun Sep 15 13:53:22:202 2019


DpWpDynCreate: created new work process W18-20407
Sun Sep 15 13:53:22:397 2019
DpWpDynCreate: created new work process W7-20409

Sun Sep 15 13:55:15:122 2019


DpHdlDeadWp: W17 (pid=19500) terminated automatically

Sun Sep 15 13:57:02:395 2019


DpHdlDeadWp: W12 (pid=462) terminated automatically
DpWpDynCreate: created new work process W12-21773

Sun Sep 15 13:58:40:048 2019


DpWpCheck: dyn W7, pid 20409 no longer needed, terminate now
DpWpCheck: dyn W18, pid 20407 no longer needed, terminate now

Sun Sep 15 13:58:40:586 2019


DpHdlDeadWp: W7 (pid=20409) terminated automatically

Sun Sep 15 13:58:43:162 2019


DpHdlDeadWp: W18 (pid=20407) terminated automatically

Sun Sep 15 14:00:17:238 2019


DpWpDynCreate: created new work process W19-22880

Sun Sep 15 14:00:17:655 2019


DpWpDynCreate: created new work process W20-22885

Sun Sep 15 14:05:20:060 2019


DpWpCheck: dyn W19, pid 22880 no longer needed, terminate now
DpWpCheck: dyn W20, pid 22885 no longer needed, terminate now

Sun Sep 15 14:05:20:540 2019


DpHdlDeadWp: W19 (pid=22880) terminated automatically
DpHdlDeadWp: W20 (pid=22885) terminated automatically

Sun Sep 15 14:06:15:124 2019


DpWpDynCreate: created new work process W17-7121

Sun Sep 15 14:11:16:744 2019


DpHdlDeadWp: W17 (pid=7121) terminated automatically

Sun Sep 15 14:16:19:700 2019


DpWpDynCreate: created new work process W7-23525

Sun Sep 15 14:16:30:371 2019


DpWpDynCreate: created new work process W18-23612

Sun Sep 15 14:21:11:220 2019


DpWpDynCreate: created new work process W19-25611

Sun Sep 15 14:21:20:088 2019


DpWpCheck: dyn W7, pid 23525 no longer needed, terminate now

Sun Sep 15 14:21:20:412 2019


DpHdlDeadWp: W7 (pid=23525) terminated automatically

Sun Sep 15 14:21:40:089 2019


DpWpCheck: dyn W18, pid 23612 no longer needed, terminate now

Sun Sep 15 14:21:40:439 2019


DpHdlDeadWp: W18 (pid=23612) terminated automatically

Sun Sep 15 14:22:01:960 2019


DpWpDynCreate: created new work process W20-25918

Sun Sep 15 14:26:20:011 2019


DpHdlDeadWp: W19 (pid=25611) terminated automatically

Sun Sep 15 14:27:03:362 2019


DpHdlDeadWp: W20 (pid=25918) terminated automatically

Sun Sep 15 14:31:11:597 2019


DpWpDynCreate: created new work process W17-28944

Sun Sep 15 14:35:00:898 2019


DpWpDynCreate: created new work process W7-30145
DpWpDynCreate: created new work process W18-30146

Sun Sep 15 14:36:01:845 2019


DpWpDynCreate: created new work process W19-30588

Sun Sep 15 14:36:12:376 2019


DpHdlDeadWp: W17 (pid=28944) terminated automatically

Sun Sep 15 14:40:01:847 2019


DpHdlDeadWp: W7 (pid=30145) terminated automatically
DpHdlDeadWp: W18 (pid=30146) terminated automatically

Sun Sep 15 14:41:02:975 2019


DpHdlDeadWp: W19 (pid=30588) terminated automatically

Sun Sep 15 14:41:23:330 2019


DpWpDynCreate: created new work process W20-32217

Sun Sep 15 14:41:23:581 2019


DpWpDynCreate: created new work process W17-32218

Sun Sep 15 14:46:26:891 2019


DpHdlDeadWp: W17 (pid=32218) terminated automatically
DpWpCheck: dyn W20, pid 32217 no longer needed, terminate now

Sun Sep 15 14:46:27:970 2019


DpHdlDeadWp: W20 (pid=32217) terminated automatically

Sun Sep 15 15:06:13:734 2019


DpWpDynCreate: created new work process W7-24367

Sun Sep 15 15:11:18:488 2019


DpHdlDeadWp: W7 (pid=24367) terminated automatically

Sun Sep 15 15:11:27:779 2019


DpWpDynCreate: created new work process W18-5227

Sun Sep 15 15:16:08:667 2019


DpWpDynCreate: created new work process W19-7133

Sun Sep 15 15:16:39:369 2019


DpHdlDeadWp: W18 (pid=5227) terminated automatically
Sun Sep 15 15:19:59:214 2019
DpWpDynCreate: created new work process W17-8447

Sun Sep 15 15:21:09:801 2019


DpHdlDeadWp: W19 (pid=7133) terminated automatically

Sun Sep 15 15:25:00:209 2019


DpWpCheck: dyn W17, pid 8447 no longer needed, terminate now

Sun Sep 15 15:25:00:696 2019


DpHdlDeadWp: W17 (pid=8447) terminated automatically

Sun Sep 15 15:26:07:352 2019


DpWpDynCreate: created new work process W20-10409

Sun Sep 15 15:31:08:587 2019


DpHdlDeadWp: W20 (pid=10409) terminated automatically

Sun Sep 15 15:38:01:334 2019


DpWpDynCreate: created new work process W7-14148

Sun Sep 15 15:43:03:353 2019


DpHdlDeadWp: W7 (pid=14148) terminated automatically

Sun Sep 15 15:51:00:977 2019


DpWpDynCreate: created new work process W18-18316

Sun Sep 15 15:51:23:497 2019


DpWpDynCreate: created new work process W19-18415

Sun Sep 15 15:56:01:355 2019


DpHdlDeadWp: W18 (pid=18316) terminated automatically

Sun Sep 15 15:56:06:291 2019


DpWpDynCreate: created new work process W17-20061

Sun Sep 15 15:56:40:260 2019


DpWpCheck: dyn W19, pid 18415 no longer needed, terminate now

Sun Sep 15 15:56:40:458 2019


DpHdlDeadWp: W19 (pid=18415) terminated automatically

Sun Sep 15 16:00:59:567 2019


DpWpDynCreate: created new work process W20-21774

Sun Sep 15 16:01:07:565 2019


DpHdlDeadWp: W17 (pid=20061) terminated automatically

Sun Sep 15 16:06:00:283 2019


DpWpCheck: dyn W20, pid 21774 no longer needed, terminate now

Sun Sep 15 16:06:00:616 2019


DpHdlDeadWp: W20 (pid=21774) terminated automatically

Sun Sep 15 16:06:10:781 2019


DpWpDynCreate: created new work process W7-8012

Sun Sep 15 16:11:11:514 2019


DpHdlDeadWp: W7 (pid=8012) terminated automatically
Sun Sep 15 16:11:20:789 2019
DpWpDynCreate: created new work process W18-20355

Sun Sep 15 16:16:02:313 2019


DpWpDynCreate: created new work process W19-22224

Sun Sep 15 16:16:35:706 2019


DpHdlDeadWp: W18 (pid=20355) terminated automatically

Sun Sep 15 16:21:03:369 2019


DpHdlDeadWp: W19 (pid=22224) terminated automatically

Sun Sep 15 16:21:05:823 2019


DpWpDynCreate: created new work process W17-24013

Sun Sep 15 16:21:17:100 2019


DpWpDynCreate: created new work process W20-24069

Sun Sep 15 16:26:07:044 2019


DpHdlDeadWp: W17 (pid=24013) terminated automatically

Sun Sep 15 16:26:20:324 2019


DpWpCheck: dyn W20, pid 24069 no longer needed, terminate now

Sun Sep 15 16:26:21:247 2019


DpHdlDeadWp: W20 (pid=24069) terminated automatically

Sun Sep 15 16:31:20:809 2019


DpWpDynCreate: created new work process W7-27443

Sun Sep 15 16:31:21:044 2019


DpWpDynCreate: created new work process W18-27444

Sun Sep 15 16:36:40:343 2019


DpWpCheck: dyn W7, pid 27443 no longer needed, terminate now
DpWpCheck: dyn W18, pid 27444 no longer needed, terminate now

Sun Sep 15 16:36:40:804 2019


DpHdlDeadWp: W7 (pid=27443) terminated automatically
DpHdlDeadWp: W18 (pid=27444) terminated automatically

Sun Sep 15 16:36:59:227 2019


DpWpDynCreate: created new work process W19-29276

Sun Sep 15 16:41:28:288 2019


DpWpDynCreate: created new work process W17-30864

Sun Sep 15 16:42:00:353 2019


DpWpCheck: dyn W19, pid 29276 no longer needed, terminate now

Sun Sep 15 16:42:01:001 2019


DpHdlDeadWp: W19 (pid=29276) terminated automatically

Sun Sep 15 16:46:40:358 2019


DpWpCheck: dyn W17, pid 30864 no longer needed, terminate now

Sun Sep 15 16:46:41:346 2019


DpHdlDeadWp: W17 (pid=30864) terminated automatically
Sun Sep 15 16:47:00:259 2019
DpWpDynCreate: created new work process W20-32625

Sun Sep 15 16:51:16:685 2019


DpWpDynCreate: created new work process W7-1660

Sun Sep 15 16:51:16:819 2019


DpWpDynCreate: created new work process W18-1661

Sun Sep 15 16:52:01:290 2019


DpHdlDeadWp: W20 (pid=32625) terminated automatically

Sun Sep 15 16:56:17:145 2019


DpHdlDeadWp: W7 (pid=1660) terminated automatically
DpWpCheck: dyn W18, pid 1661 no longer needed, terminate now

Sun Sep 15 16:56:17:263 2019


DpHdlDeadWp: W18 (pid=1661) terminated automatically

Sun Sep 15 17:01:06:420 2019


DpWpDynCreate: created new work process W19-5242

Sun Sep 15 17:06:07:490 2019


DpHdlDeadWp: W19 (pid=5242) terminated automatically

Sun Sep 15 17:06:13:804 2019


DpWpDynCreate: created new work process W17-24005

Sun Sep 15 17:11:20:400 2019


DpWpCheck: dyn W17, pid 24005 no longer needed, terminate now

Sun Sep 15 17:11:20:814 2019


DpHdlDeadWp: W17 (pid=24005) terminated automatically

Sun Sep 15 17:16:07:347 2019


DpWpDynCreate: created new work process W20-5511
DpWpDynCreate: created new work process W7-5512

Sun Sep 15 17:20:53:690 2019


DpHdlDeadWp: W10 (pid=31095) terminated automatically
DpWpDynCreate: created new work process W10-7375

Sun Sep 15 17:21:20:414 2019


DpWpCheck: dyn W7, pid 5512 no longer needed, terminate now
DpWpCheck: dyn W20, pid 5511 no longer needed, terminate now

Sun Sep 15 17:21:21:365 2019


DpHdlDeadWp: W7 (pid=5512) terminated automatically
DpHdlDeadWp: W20 (pid=5511) terminated automatically

Sun Sep 15 17:22:25:554 2019


DpHdlDeadWp: W12 (pid=21773) terminated automatically
DpWpDynCreate: created new work process W12-7911

Sun Sep 15 17:26:11:135 2019


DpWpDynCreate: created new work process W18-8988

Sun Sep 15 17:31:12:630 2019


DpHdlDeadWp: W18 (pid=8988) terminated automatically

Sun Sep 15 17:31:16:490 2019


DpWpDynCreate: created new work process W19-10690

Sun Sep 15 17:36:18:318 2019


DpHdlDeadWp: W19 (pid=10690) terminated automatically

Sun Sep 15 17:41:17:069 2019


DpWpDynCreate: created new work process W17-13913

Sun Sep 15 17:46:12:850 2019


DpWpDynCreate: created new work process W7-15498

Sun Sep 15 17:46:18:633 2019


DpHdlDeadWp: W17 (pid=13913) terminated automatically

Sun Sep 15 17:51:16:595 2019


DpWpDynCreate: created new work process W20-17218

Sun Sep 15 17:51:17:066 2019


DpHdlDeadWp: W7 (pid=15498) terminated automatically

Sun Sep 15 17:51:17:523 2019


DpWpDynCreate: created new work process W18-17221

Sun Sep 15 17:51:17:747 2019


DpWpDynCreate: created new work process W19-17222

Sun Sep 15 17:56:17:315 2019


DpHdlDeadWp: W20 (pid=17218) terminated automatically

Sun Sep 15 17:56:20:477 2019


DpWpCheck: dyn W18, pid 17221 no longer needed, terminate now
DpWpCheck: dyn W19, pid 17222 no longer needed, terminate now

Sun Sep 15 17:56:21:397 2019


DpHdlDeadWp: W18 (pid=17221) terminated automatically
DpHdlDeadWp: W19 (pid=17222) terminated automatically

Sun Sep 15 17:57:01:057 2019


DpWpDynCreate: created new work process W17-18939

Sun Sep 15 18:01:19:145 2019


DpWpDynCreate: created new work process W7-20984

Sun Sep 15 18:02:20:491 2019


DpWpCheck: dyn W17, pid 18939 no longer needed, terminate now

Sun Sep 15 18:02:20:774 2019


DpHdlDeadWp: W17 (pid=18939) terminated automatically

Sun Sep 15 18:04:59:466 2019


DpWpDynCreate: created new work process W20-3411

Sun Sep 15 18:06:20:551 2019


DpHdlDeadWp: W7 (pid=20984) terminated automatically

Sun Sep 15 18:10:00:439 2019


DpHdlDeadWp: W20 (pid=3411) terminated automatically

Sun Sep 15 18:11:17:816 2019


DpWpDynCreate: created new work process W18-19189

Sun Sep 15 18:11:18:077 2019


DpWpDynCreate: created new work process W19-19190

Sun Sep 15 18:16:18:713 2019


DpHdlDeadWp: W18 (pid=19189) terminated automatically

Sun Sep 15 18:16:20:567 2019


DpWpCheck: dyn W19, pid 19190 no longer needed, terminate now

Sun Sep 15 18:16:21:513 2019


DpHdlDeadWp: W19 (pid=19190) terminated automatically

Sun Sep 15 18:21:09:388 2019


DpWpDynCreate: created new work process W17-22602

Sun Sep 15 18:26:10:752 2019


DpHdlDeadWp: W17 (pid=22602) terminated automatically

Sun Sep 15 18:31:16:853 2019


DpWpDynCreate: created new work process W7-26160

Sun Sep 15 18:34:59:290 2019


DpWpDynCreate: created new work process W20-27358

Sun Sep 15 18:35:00:066 2019


DpWpDynCreate: created new work process W18-27361
DpWpDynCreate: created new work process W19-27363

Sun Sep 15 18:36:17:620 2019


DpHdlDeadWp: W7 (pid=26160) terminated automatically

Sun Sep 15 18:40:00:605 2019


DpWpCheck: dyn W20, pid 27358 no longer needed, terminate now

Sun Sep 15 18:40:00:816 2019


DpHdlDeadWp: W20 (pid=27358) terminated automatically

Sun Sep 15 18:40:20:605 2019


DpWpCheck: dyn W18, pid 27361 no longer needed, terminate now
DpWpCheck: dyn W19, pid 27363 no longer needed, terminate now

Sun Sep 15 18:40:20:910 2019


DpHdlDeadWp: W18 (pid=27361) terminated automatically
DpHdlDeadWp: W19 (pid=27363) terminated automatically

Sun Sep 15 18:40:59:979 2019


DpWpDynCreate: created new work process W17-29211

Sun Sep 15 18:41:00:385 2019


DpWpDynCreate: created new work process W7-29215

Sun Sep 15 18:46:01:445 2019


DpWpCheck: dyn W7, pid 29215 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=29211) terminated automatically
Sun Sep 15 18:46:02:447 2019
DpHdlDeadWp: W7 (pid=29215) terminated automatically

Sun Sep 15 18:47:14:085 2019


DpWpDynCreate: created new work process W20-31324

Sun Sep 15 18:51:05:854 2019


DpWpDynCreate: created new work process W18-32514

Sun Sep 15 18:52:20:622 2019


DpWpCheck: dyn W20, pid 31324 no longer needed, terminate now

Sun Sep 15 18:52:21:391 2019


DpHdlDeadWp: W20 (pid=31324) terminated automatically

Sun Sep 15 18:56:06:921 2019


DpHdlDeadWp: W18 (pid=32514) terminated automatically

Sun Sep 15 18:56:11:246 2019


DpWpDynCreate: created new work process W19-1949

Sun Sep 15 19:00:59:661 2019


DpWpDynCreate: created new work process W17-3633

Sun Sep 15 19:01:12:811 2019


DpHdlDeadWp: W19 (pid=1949) terminated automatically

Sun Sep 15 19:01:17:440 2019


DpWpDynCreate: created new work process W7-3645

Sun Sep 15 19:06:00:566 2019


DpHdlDeadWp: W17 (pid=3633) terminated automatically

Sun Sep 15 19:06:14:649 2019


DpWpDynCreate: created new work process W20-8435

Sun Sep 15 19:06:20:649 2019


DpWpCheck: dyn W7, pid 3645 no longer needed, terminate now

Sun Sep 15 19:06:20:803 2019


DpHdlDeadWp: W7 (pid=3645) terminated automatically

Sun Sep 15 19:11:17:823 2019


DpHdlDeadWp: W20 (pid=8435) terminated automatically

Sun Sep 15 19:12:03:033 2019


DpWpDynCreate: created new work process W18-30573

Sun Sep 15 19:12:03:206 2019


DpWpDynCreate: created new work process W19-30574

Sun Sep 15 19:17:04:972 2019


DpHdlDeadWp: W18 (pid=30573) terminated automatically
DpWpCheck: dyn W19, pid 30574 no longer needed, terminate now

Sun Sep 15 19:17:06:062 2019


DpHdlDeadWp: W19 (pid=30574) terminated automatically
Sun Sep 15 19:21:17:229 2019
DpWpDynCreate: created new work process W17-6151

Sun Sep 15 19:25:59:744 2019


DpWpDynCreate: created new work process W7-7780

Sun Sep 15 19:26:00:741 2019


DpWpDynCreate: created new work process W20-7783

Sun Sep 15 19:26:20:678 2019


DpWpCheck: dyn W17, pid 6151 no longer needed, terminate now

Sun Sep 15 19:26:21:557 2019


DpHdlDeadWp: W17 (pid=6151) terminated automatically

Sun Sep 15 19:31:00:686 2019


DpWpCheck: dyn W7, pid 7780 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=7780) terminated automatically

Sun Sep 15 19:31:01:426 2019


DpHdlDeadWp: W20 (pid=7783) terminated automatically

Sun Sep 15 19:31:16:427 2019


DpWpDynCreate: created new work process W18-9508

Sun Sep 15 19:36:16:545 2019


DpWpDynCreate: created new work process W19-11076

Sun Sep 15 19:36:18:654 2019


DpHdlDeadWp: W18 (pid=9508) terminated automatically

Sun Sep 15 19:41:17:163 2019


DpHdlDeadWp: W19 (pid=11076) terminated automatically

Sun Sep 15 19:41:18:194 2019


DpWpDynCreate: created new work process W17-12661
DpWpDynCreate: created new work process W7-12662

Sun Sep 15 19:46:20:713 2019


DpWpCheck: dyn W7, pid 12662 no longer needed, terminate now
DpWpCheck: dyn W17, pid 12661 no longer needed, terminate now

Sun Sep 15 19:46:21:443 2019


DpHdlDeadWp: W7 (pid=12662) terminated automatically
DpHdlDeadWp: W17 (pid=12661) terminated automatically

Sun Sep 15 19:51:09:777 2019


DpWpDynCreate: created new work process W20-16033

Sun Sep 15 19:51:17:693 2019


DpWpDynCreate: created new work process W18-16079

Sun Sep 15 19:51:17:909 2019


DpWpDynCreate: created new work process W19-16080

Sun Sep 15 19:51:19:204 2019


DpWpDynCreate: created new work process W7-16085

Sun Sep 15 19:56:07:184 2019


DpWpDynCreate: created new work process W17-17625

Sun Sep 15 19:56:16:860 2019


DpHdlDeadWp: W20 (pid=16033) terminated automatically

Sun Sep 15 19:56:20:730 2019


DpWpCheck: dyn W7, pid 16085 no longer needed, terminate now
DpWpCheck: dyn W18, pid 16079 no longer needed, terminate now
DpWpCheck: dyn W19, pid 16080 no longer needed, terminate now

Sun Sep 15 19:56:21:039 2019


DpHdlDeadWp: W7 (pid=16085) terminated automatically
DpHdlDeadWp: W18 (pid=16079) terminated automatically
DpHdlDeadWp: W19 (pid=16080) terminated automatically

Sun Sep 15 19:58:01:732 2019


DpHdlDeadWp: W11 (pid=11431) terminated automatically
DpWpDynCreate: created new work process W11-18162

Sun Sep 15 20:01:20:737 2019


DpWpCheck: dyn W17, pid 17625 no longer needed, terminate now

Sun Sep 15 20:01:20:994 2019


DpHdlDeadWp: W17 (pid=17625) terminated automatically

Sun Sep 15 20:01:32:804 2019


DpWpDynCreate: created new work process W20-19872

Sun Sep 15 20:05:46:881 2019


DpWpDynCreate: created new work process W7-28974

Sun Sep 15 20:06:33:774 2019


DpHdlDeadWp: W20 (pid=19872) terminated automatically

Sun Sep 15 20:10:48:553 2019


DpHdlDeadWp: W7 (pid=28974) terminated automatically

Sun Sep 15 20:12:25:575 2019


DpWpDynCreate: created new work process W18-8270
DpWpDynCreate: created new work process W19-8271

Sun Sep 15 20:17:40:765 2019


DpWpCheck: dyn W18, pid 8270 no longer needed, terminate now
DpWpCheck: dyn W19, pid 8271 no longer needed, terminate now

Sun Sep 15 20:17:40:944 2019


DpHdlDeadWp: W18 (pid=8270) terminated automatically
DpHdlDeadWp: W19 (pid=8271) terminated automatically

Sun Sep 15 20:21:05:477 2019


DpWpDynCreate: created new work process W17-21579

Sun Sep 15 20:26:08:759 2019


DpHdlDeadWp: W17 (pid=21579) terminated automatically

Sun Sep 15 20:26:44:496 2019


DpWpDynCreate: created new work process W20-23668

Sun Sep 15 20:31:59:060 2019


DpHdlDeadWp: W20 (pid=23668) terminated automatically

Sun Sep 15 20:32:19:401 2019


DpWpDynCreate: created new work process W7-25526

Sun Sep 15 20:32:19:622 2019


DpWpDynCreate: created new work process W18-25527

Sun Sep 15 20:37:20:803 2019


DpWpCheck: dyn W18, pid 25527 no longer needed, terminate now

Sun Sep 15 20:37:21:664 2019


DpHdlDeadWp: W7 (pid=25526) terminated automatically
DpHdlDeadWp: W18 (pid=25527) terminated automatically

Sun Sep 15 20:42:18:207 2019


DpWpDynCreate: created new work process W19-28844

Sun Sep 15 20:46:00:832 2019


DpWpDynCreate: created new work process W17-30234

Sun Sep 15 20:47:20:820 2019


DpWpCheck: dyn W19, pid 28844 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=28844) terminated automatically

Sun Sep 15 20:51:02:325 2019


DpHdlDeadWp: W17 (pid=30234) terminated automatically

Sun Sep 15 20:51:08:864 2019


DpWpDynCreate: created new work process W20-31771

Sun Sep 15 20:55:08:124 2019


DpHdlDeadWp: W10 (pid=7375) terminated automatically
DpWpDynCreate: created new work process W10-870

Sun Sep 15 20:56:04:625 2019


DpWpDynCreate: created new work process W7-1166

Sun Sep 15 20:56:10:277 2019


DpHdlDeadWp: W20 (pid=31771) terminated automatically

Sun Sep 15 20:57:11:957 2019


DpWpDynCreate: created new work process W18-1519

Sun Sep 15 21:01:07:798 2019


DpHdlDeadWp: W7 (pid=1166) terminated automatically

Sun Sep 15 21:02:18:713 2019


DpHdlDeadWp: W18 (pid=1519) terminated automatically

Sun Sep 15 21:02:19:915 2019


DpWpDynCreate: created new work process W19-7494

Sun Sep 15 21:07:20:853 2019


DpWpCheck: dyn W19, pid 7494 no longer needed, terminate now

Sun Sep 15 21:07:21:186 2019


DpHdlDeadWp: W19 (pid=7494) terminated automatically
Sun Sep 15 21:12:18:786 2019
DpWpDynCreate: created new work process W17-2120

Sun Sep 15 21:12:19:059 2019


DpWpDynCreate: created new work process W20-2121

Sun Sep 15 21:17:20:877 2019


DpWpCheck: dyn W17, pid 2120 no longer needed, terminate now
DpWpCheck: dyn W20, pid 2121 no longer needed, terminate now

Sun Sep 15 21:17:21:689 2019


DpHdlDeadWp: W17 (pid=2120) terminated automatically
DpHdlDeadWp: W20 (pid=2121) terminated automatically

Sun Sep 15 21:19:03:617 2019


DpWpDynCreate: created new work process W7-4674

Sun Sep 15 21:22:17:215 2019


DpWpDynCreate: created new work process W18-5887

Sun Sep 15 21:24:20:889 2019


DpWpCheck: dyn W7, pid 4674 no longer needed, terminate now

Sun Sep 15 21:24:21:093 2019


DpHdlDeadWp: W7 (pid=4674) terminated automatically

Sun Sep 15 21:27:10:371 2019


DpWpDynCreate: created new work process W19-7667

Sun Sep 15 21:27:10:686 2019


DpWpDynCreate: created new work process W17-7668

Sun Sep 15 21:27:20:894 2019


DpWpCheck: dyn W18, pid 5887 no longer needed, terminate now

Sun Sep 15 21:27:21:309 2019


DpHdlDeadWp: W18 (pid=5887) terminated automatically

Sun Sep 15 21:32:11:575 2019


DpHdlDeadWp: W17 (pid=7668) terminated automatically

Sun Sep 15 21:32:11:749 2019


DpHdlDeadWp: W19 (pid=7667) terminated automatically

Sun Sep 15 21:32:16:679 2019


DpWpDynCreate: created new work process W20-9289

Sun Sep 15 21:36:59:311 2019


DpWpDynCreate: created new work process W7-10822

Sun Sep 15 21:37:17:103 2019


DpHdlDeadWp: W20 (pid=9289) terminated automatically

Sun Sep 15 21:41:48:208 2019


DpWpDynCreate: created new work process W18-12533

Sun Sep 15 21:42:00:920 2019


DpWpCheck: dyn W7, pid 10822 no longer needed, terminate now
Sun Sep 15 21:42:01:095 2019
DpHdlDeadWp: W7 (pid=10822) terminated automatically

Sun Sep 15 21:46:58:565 2019


DpHdlDeadWp: W18 (pid=12533) terminated automatically

Sun Sep 15 21:47:14:844 2019


DpWpDynCreate: created new work process W17-13994

Sun Sep 15 21:52:02:970 2019


DpWpDynCreate: created new work process W19-15823

Sun Sep 15 21:52:16:388 2019


DpHdlDeadWp: W17 (pid=13994) terminated automatically

Sun Sep 15 21:52:19:666 2019


DpWpDynCreate: created new work process W20-16015

Sun Sep 15 21:57:20:946 2019


DpWpCheck: dyn W19, pid 15823 no longer needed, terminate now
DpWpCheck: dyn W20, pid 16015 no longer needed, terminate now

Sun Sep 15 21:57:21:987 2019


DpHdlDeadWp: W19 (pid=15823) terminated automatically
DpHdlDeadWp: W20 (pid=16015) terminated automatically

Sun Sep 15 21:57:59:820 2019


DpWpDynCreate: created new work process W7-17677

Sun Sep 15 22:02:18:520 2019


DpWpDynCreate: created new work process W18-23268

Sun Sep 15 22:03:00:956 2019


DpWpCheck: dyn W7, pid 17677 no longer needed, terminate now

Sun Sep 15 22:03:01:170 2019


DpHdlDeadWp: W7 (pid=17677) terminated automatically

Sun Sep 15 22:07:20:964 2019


DpWpCheck: dyn W18, pid 23268 no longer needed, terminate now

Sun Sep 15 22:07:21:094 2019


DpHdlDeadWp: W18 (pid=23268) terminated automatically

Sun Sep 15 22:11:00:977 2019


DpWpDynCreate: created new work process W17-17496

Sun Sep 15 22:11:02:903 2019


DpWpDynCreate: created new work process W19-17507

Sun Sep 15 22:11:11:236 2019


DpWpDynCreate: created new work process W20-17643

Sun Sep 15 22:16:03:748 2019


DpWpCheck: dyn W17, pid 17496 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=17507) terminated automatically

Sun Sep 15 22:16:04:402 2019


DpHdlDeadWp: W17 (pid=17496) terminated automatically
Sun Sep 15 22:16:20:981 2019
DpWpCheck: dyn W20, pid 17643 no longer needed, terminate now

Sun Sep 15 22:16:21:866 2019


DpHdlDeadWp: W20 (pid=17643) terminated automatically

Sun Sep 15 22:16:33:021 2019


DpWpDynCreate: created new work process W7-19320

Sun Sep 15 22:21:00:947 2019


DpWpDynCreate: created new work process W18-21097

Sun Sep 15 22:21:40:989 2019


DpWpCheck: dyn W7, pid 19320 no longer needed, terminate now

Sun Sep 15 22:21:41:120 2019


DpHdlDeadWp: W7 (pid=19320) terminated automatically

Sun Sep 15 22:22:17:555 2019


DpWpDynCreate: created new work process W19-21487

Sun Sep 15 22:26:04:084 2019


DpHdlDeadWp: W18 (pid=21097) terminated automatically

Sun Sep 15 22:27:09:853 2019


DpWpDynCreate: created new work process W17-22871

Sun Sep 15 22:27:20:995 2019


DpWpCheck: dyn W19, pid 21487 no longer needed, terminate now

Sun Sep 15 22:27:21:460 2019


DpHdlDeadWp: W19 (pid=21487) terminated automatically

Sun Sep 15 22:32:11:919 2019


DpHdlDeadWp: W17 (pid=22871) terminated automatically

Sun Sep 15 22:32:19:391 2019


DpWpDynCreate: created new work process W20-24839

Sun Sep 15 22:36:01:106 2019


DpWpDynCreate: created new work process W7-26020

Sun Sep 15 22:37:21:048 2019


DpHdlDeadWp: W20 (pid=24839) terminated automatically

Sun Sep 15 22:41:00:997 2019


DpWpDynCreate: created new work process W18-27760

Sun Sep 15 22:41:02:647 2019


DpHdlDeadWp: W7 (pid=26020) terminated automatically

Sun Sep 15 22:41:09:823 2019


DpWpDynCreate: created new work process W19-27767

Sun Sep 15 22:46:01:063 2019


DpWpCheck: dyn W18, pid 27760 no longer needed, terminate now

Sun Sep 15 22:46:01:281 2019


DpHdlDeadWp: W18 (pid=27760) terminated automatically

Sun Sep 15 22:46:21:063 2019


DpWpCheck: dyn W19, pid 27767 no longer needed, terminate now

Sun Sep 15 22:46:21:423 2019


DpHdlDeadWp: W19 (pid=27767) terminated automatically

Sun Sep 15 22:47:08:079 2019


DpWpDynCreate: created new work process W17-29793

Sun Sep 15 22:47:09:089 2019


DpWpDynCreate: created new work process W20-29796

Sun Sep 15 22:52:09:344 2019


DpHdlDeadWp: W17 (pid=29793) terminated automatically

Sun Sep 15 22:52:10:363 2019


DpHdlDeadWp: W20 (pid=29796) terminated automatically

Sun Sep 15 22:52:14:996 2019


DpWpDynCreate: created new work process W7-31444

Sun Sep 15 22:55:05:370 2019


DpWpDynCreate: created new work process W18-32385

Sun Sep 15 22:57:17:824 2019


DpHdlDeadWp: W7 (pid=31444) terminated automatically

Sun Sep 15 23:00:21:088 2019


DpWpCheck: dyn W18, pid 32385 no longer needed, terminate now

Sun Sep 15 23:00:22:148 2019


DpHdlDeadWp: W18 (pid=32385) terminated automatically

Sun Sep 15 23:01:56:329 2019


DpWpDynCreate: created new work process W19-5032

Sun Sep 15 23:06:58:221 2019


DpHdlDeadWp: W19 (pid=5032) terminated automatically

Sun Sep 15 23:07:15:823 2019


DpWpDynCreate: created new work process W17-22448

Sun Sep 15 23:12:17:287 2019


DpHdlDeadWp: W17 (pid=22448) terminated automatically

Sun Sep 15 23:12:17:995 2019


DpWpDynCreate: created new work process W20-1146

Sun Sep 15 23:17:21:118 2019


DpWpCheck: dyn W20, pid 1146 no longer needed, terminate now

Sun Sep 15 23:17:21:997 2019


DpHdlDeadWp: W20 (pid=1146) terminated automatically

Sun Sep 15 23:22:16:759 2019


DpWpDynCreate: created new work process W7-4750
Sun Sep 15 23:27:00:646 2019
DpWpDynCreate: created new work process W18-6405

Sun Sep 15 23:27:10:226 2019


DpWpDynCreate: created new work process W19-6520

Sun Sep 15 23:27:21:137 2019


DpWpCheck: dyn W7, pid 4750 no longer needed, terminate now

Sun Sep 15 23:27:21:513 2019


DpHdlDeadWp: W7 (pid=4750) terminated automatically

Sun Sep 15 23:32:02:917 2019


DpHdlDeadWp: W18 (pid=6405) terminated automatically

Sun Sep 15 23:32:11:782 2019


DpHdlDeadWp: W19 (pid=6520) terminated automatically

Sun Sep 15 23:32:18:485 2019


DpWpDynCreate: created new work process W17-8152

Sun Sep 15 23:37:10:536 2019


DpWpDynCreate: created new work process W20-9891

Sun Sep 15 23:37:10:692 2019


DpWpDynCreate: created new work process W7-9892

Sun Sep 15 23:37:21:154 2019


DpWpCheck: dyn W17, pid 8152 no longer needed, terminate now

Sun Sep 15 23:37:22:242 2019


DpHdlDeadWp: W17 (pid=8152) terminated automatically

Sun Sep 15 23:42:11:731 2019


DpHdlDeadWp: W7 (pid=9892) terminated automatically
DpWpCheck: dyn W20, pid 9891 no longer needed, terminate now

Sun Sep 15 23:42:11:875 2019


DpHdlDeadWp: W20 (pid=9891) terminated automatically

Sun Sep 15 23:42:16:739 2019


DpWpDynCreate: created new work process W18-11312

Sun Sep 15 23:42:17:034 2019


DpWpDynCreate: created new work process W19-11313

Sun Sep 15 23:47:17:861 2019


DpHdlDeadWp: W18 (pid=11312) terminated automatically

Sun Sep 15 23:47:21:170 2019


DpWpCheck: dyn W19, pid 11313 no longer needed, terminate now

Sun Sep 15 23:47:21:515 2019


DpHdlDeadWp: W19 (pid=11313) terminated automatically

Sun Sep 15 23:48:03:238 2019


DpWpDynCreate: created new work process W17-13391

Sun Sep 15 23:52:19:335 2019


DpWpDynCreate: created new work process W7-14794

Sun Sep 15 23:53:05:907 2019


DpHdlDeadWp: W17 (pid=13391) terminated automatically

Sun Sep 15 23:57:00:872 2019


DpWpDynCreate: created new work process W20-16585

Sun Sep 15 23:57:39:821 2019


DpHdlDeadWp: W7 (pid=14794) terminated automatically

Mon Sep 16 00:00:04:803 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 9->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5
DpWpConf: wp reconfiguration, stop W20, pid 16585
DpAdaptWppriv_max_no : 4 -> 4
DpHdlDeadWp: W20 (pid=16585) terminated automatically

Mon Sep 16 00:02:22:810 2019


DpWpDynCreate: created new work process W18-18369

Mon Sep 16 00:07:14:562 2019


DpWpDynCreate: created new work process W19-1481

Mon Sep 16 00:07:32:011 2019


DpHdlDeadWp: W18 (pid=18369) terminated automatically

Mon Sep 16 00:09:04:699 2019


DpWpDynCreate: created new work process W17-3794

Mon Sep 16 00:12:15:634 2019


DpHdlDeadWp: W19 (pid=1481) terminated automatically

Mon Sep 16 00:12:21:810 2019


DpWpDynCreate: created new work process W7-16384

Mon Sep 16 00:12:23:626 2019


DpWpDynCreate: created new work process W20-16436

Mon Sep 16 00:14:21:217 2019


DpWpCheck: dyn W17, pid 3794 no longer needed, terminate now

Mon Sep 16 00:14:21:750 2019


DpHdlDeadWp: W17 (pid=3794) terminated automatically

Mon Sep 16 00:17:41:223 2019


DpWpCheck: dyn W7, pid 16384 no longer needed, terminate now
DpWpCheck: dyn W20, pid 16436 no longer needed, terminate now

Mon Sep 16 00:17:41:654 2019


DpHdlDeadWp: W7 (pid=16384) terminated automatically
DpHdlDeadWp: W20 (pid=16436) terminated automatically

Mon Sep 16 00:18:03:427 2019


DpWpDynCreate: created new work process W18-23311

Mon Sep 16 00:20:04:767 2019


DpWpDynCreate: created new work process W19-28418
Mon Sep 16 00:23:11:414 2019
DpHdlDeadWp: W18 (pid=23311) terminated automatically

Mon Sep 16 00:25:05:910 2019


DpHdlDeadWp: W19 (pid=28418) terminated automatically

Mon Sep 16 00:27:10:679 2019


DpWpDynCreate: created new work process W17-10813

Mon Sep 16 00:32:11:006 2019


DpWpDynCreate: created new work process W7-13782

Mon Sep 16 00:32:12:276 2019


DpHdlDeadWp: W17 (pid=10813) terminated automatically

Mon Sep 16 00:37:21:260 2019


DpWpCheck: dyn W7, pid 13782 no longer needed, terminate now

Mon Sep 16 00:37:21:719 2019


DpHdlDeadWp: W7 (pid=13782) terminated automatically

Mon Sep 16 00:40:05:212 2019


DpWpDynCreate: created new work process W20-16994

Mon Sep 16 00:45:21:281 2019


DpWpCheck: dyn W20, pid 16994 no longer needed, terminate now

Mon Sep 16 00:45:22:157 2019


DpHdlDeadWp: W20 (pid=16994) terminated automatically

Mon Sep 16 00:47:11:980 2019


DpWpDynCreate: created new work process W18-19400

Mon Sep 16 00:52:21:293 2019


DpWpCheck: dyn W18, pid 19400 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=19400) terminated automatically

Mon Sep 16 00:54:23:442 2019


DpWpDynCreate: created new work process W19-2074

Mon Sep 16 00:59:41:306 2019


DpWpCheck: dyn W19, pid 2074 no longer needed, terminate now

Mon Sep 16 00:59:41:886 2019


DpHdlDeadWp: W19 (pid=2074) terminated automatically

Mon Sep 16 01:01:48:030 2019


DpWpDynCreate: created new work process W17-7024

Mon Sep 16 01:01:48:270 2019


DpWpDynCreate: created new work process W7-7044

Mon Sep 16 01:04:45:374 2019


DpHdlDeadWp: W11 (pid=18162) terminated automatically
DpWpDynCreate: created new work process W11-13212

Mon Sep 16 01:06:58:560 2019


DpWpCheck: dyn W7, pid 7044 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=7024) terminated automatically

Mon Sep 16 01:06:59:620 2019


DpHdlDeadWp: W7 (pid=7044) terminated automatically

Mon Sep 16 01:07:27:545 2019


DpWpDynCreate: created new work process W20-21876

Mon Sep 16 01:09:03:472 2019


DpWpDynCreate: created new work process W18-29076

Mon Sep 16 01:12:17:735 2019


DpWpDynCreate: created new work process W19-3822

Mon Sep 16 01:12:36:158 2019


DpHdlDeadWp: W20 (pid=21876) terminated automatically
DpWpDynCreate: created new work process W20-4121

Mon Sep 16 01:12:39:442 2019


DpWpDynCreate: created new work process W17-4127

Mon Sep 16 01:14:21:333 2019


DpWpCheck: dyn W18, pid 29076 no longer needed, terminate now

Mon Sep 16 01:14:22:241 2019


DpHdlDeadWp: W18 (pid=29076) terminated automatically

Mon Sep 16 01:17:21:339 2019


DpWpCheck: dyn W19, pid 3822 no longer needed, terminate now

Mon Sep 16 01:17:21:643 2019


DpHdlDeadWp: W19 (pid=3822) terminated automatically

Mon Sep 16 01:17:41:340 2019


DpWpCheck: dyn W17, pid 4127 no longer needed, terminate now
DpWpCheck: dyn W20, pid 4121 no longer needed, terminate now

Mon Sep 16 01:17:41:676 2019


DpHdlDeadWp: W17 (pid=4127) terminated automatically

Mon Sep 16 01:17:43:039 2019


DpHdlDeadWp: W20 (pid=4121) terminated automatically

Mon Sep 16 01:19:59:684 2019


DpWpDynCreate: created new work process W7-6484

Mon Sep 16 01:22:18:053 2019


DpWpDynCreate: created new work process W18-7359

Mon Sep 16 01:25:00:395 2019


DpHdlDeadWp: W7 (pid=6484) terminated automatically

Mon Sep 16 01:27:19:172 2019


DpHdlDeadWp: W18 (pid=7359) terminated automatically

Mon Sep 16 01:28:59:505 2019


DpWpDynCreate: created new work process W19-9432

Mon Sep 16 01:34:01:989 2019


DpHdlDeadWp: W19 (pid=9432) terminated automatically

Mon Sep 16 01:37:00:326 2019


DpWpDynCreate: created new work process W17-12039

Mon Sep 16 01:40:43:758 2019


DpHdlDeadWp: W9 (pid=23323) terminated automatically
DpWpDynCreate: created new work process W9-13434

Mon Sep 16 01:42:01:831 2019


DpHdlDeadWp: W17 (pid=12039) terminated automatically

Mon Sep 16 01:42:15:909 2019


DpWpDynCreate: created new work process W20-13818

Mon Sep 16 01:42:22:780 2019


DpWpDynCreate: created new work process W7-13850

Mon Sep 16 01:47:19:086 2019


DpWpDynCreate: created new work process W18-15518

Mon Sep 16 01:47:21:387 2019


DpWpCheck: dyn W20, pid 13818 no longer needed, terminate now

Mon Sep 16 01:47:21:556 2019


DpHdlDeadWp: W20 (pid=13818) terminated automatically

Mon Sep 16 01:47:25:342 2019


DpHdlDeadWp: W7 (pid=13850) terminated automatically

Mon Sep 16 01:51:59:685 2019


DpWpDynCreate: created new work process W19-17114

Mon Sep 16 01:52:20:953 2019


DpHdlDeadWp: W18 (pid=15518) terminated automatically

Mon Sep 16 01:55:18:067 2019


DpWpDynCreate: created new work process W17-18078

Mon Sep 16 01:57:01:227 2019


DpHdlDeadWp: W19 (pid=17114) terminated automatically

Mon Sep 16 01:58:07:006 2019


DpWpDynCreate: created new work process W20-18996

Mon Sep 16 02:00:21:413 2019


DpWpCheck: dyn W17, pid 18078 no longer needed, terminate now

Mon Sep 16 02:00:21:780 2019


DpHdlDeadWp: W17 (pid=18078) terminated automatically

Mon Sep 16 02:03:21:420 2019


DpWpCheck: dyn W20, pid 18996 no longer needed, terminate now

Mon Sep 16 02:03:21:978 2019


DpHdlDeadWp: W20 (pid=18996) terminated automatically

Mon Sep 16 02:06:00:420 2019


DpWpDynCreate: created new work process W7-26877
Mon Sep 16 02:06:01:121 2019
DpWpDynCreate: created new work process W18-26880

Mon Sep 16 02:11:03:501 2019


DpHdlDeadWp: W7 (pid=26877) terminated automatically
DpWpCheck: dyn W18, pid 26880 no longer needed, terminate now

Mon Sep 16 02:11:04:751 2019


DpHdlDeadWp: W18 (pid=26880) terminated automatically

Mon Sep 16 02:13:00:754 2019


DpWpDynCreate: created new work process W19-17414

Mon Sep 16 02:18:01:515 2019


DpHdlDeadWp: W19 (pid=17414) terminated automatically

Mon Sep 16 02:22:16:738 2019


DpWpDynCreate: created new work process W17-22830

Mon Sep 16 02:22:16:982 2019


DpWpDynCreate: created new work process W20-22831

Mon Sep 16 02:27:10:064 2019


DpWpDynCreate: created new work process W7-24673

Mon Sep 16 02:27:17:569 2019


DpHdlDeadWp: W17 (pid=22830) terminated automatically
DpHdlDeadWp: W20 (pid=22831) terminated automatically

Mon Sep 16 02:28:00:659 2019


DpWpDynCreate: created new work process W18-24941

Mon Sep 16 02:32:21:540 2019


DpWpCheck: dyn W7, pid 24673 no longer needed, terminate now

Mon Sep 16 02:32:21:923 2019


DpHdlDeadWp: W7 (pid=24673) terminated automatically

Mon Sep 16 02:33:02:028 2019


DpHdlDeadWp: W18 (pid=24941) terminated automatically

Mon Sep 16 02:33:59:694 2019


DpWpDynCreate: created new work process W19-26864

Mon Sep 16 02:34:03:572 2019


DpWpDynCreate: created new work process W17-26868

Mon Sep 16 02:37:11:562 2019


DpWpDynCreate: created new work process W20-27880

Mon Sep 16 02:39:01:550 2019


DpWpCheck: dyn W19, pid 26864 no longer needed, terminate now

Mon Sep 16 02:39:01:656 2019


DpHdlDeadWp: W19 (pid=26864) terminated automatically

Mon Sep 16 02:39:21:551 2019


DpWpCheck: dyn W17, pid 26868 no longer needed, terminate now
Mon Sep 16 02:39:22:331 2019
DpHdlDeadWp: W17 (pid=26868) terminated automatically

Mon Sep 16 02:42:18:576 2019


DpWpDynCreate: created new work process W7-29505

Mon Sep 16 02:42:21:555 2019


DpWpCheck: dyn W20, pid 27880 no longer needed, terminate now

Mon Sep 16 02:42:22:580 2019


DpHdlDeadWp: W20 (pid=27880) terminated automatically

Mon Sep 16 02:47:21:565 2019


DpWpCheck: dyn W7, pid 29505 no longer needed, terminate now

Mon Sep 16 02:47:21:868 2019


DpHdlDeadWp: W7 (pid=29505) terminated automatically

Mon Sep 16 02:48:59:679 2019


DpWpDynCreate: created new work process W18-31640

Mon Sep 16 02:52:15:597 2019


DpWpDynCreate: created new work process W19-32645

Mon Sep 16 02:54:17:760 2019


DpHdlDeadWp: W18 (pid=31640) terminated automatically

Mon Sep 16 02:56:06:083 2019


DpWpDynCreate: created new work process W17-1503

Mon Sep 16 02:57:21:582 2019


DpWpCheck: dyn W19, pid 32645 no longer needed, terminate now

Mon Sep 16 02:57:21:978 2019


DpHdlDeadWp: W19 (pid=32645) terminated automatically

Mon Sep 16 03:01:21:589 2019


DpWpCheck: dyn W17, pid 1503 no longer needed, terminate now

Mon Sep 16 03:01:22:343 2019


DpHdlDeadWp: W17 (pid=1503) terminated automatically

Mon Sep 16 03:02:23:076 2019


DpWpDynCreate: created new work process W20-7183

Mon Sep 16 03:07:01:293 2019


DpWpDynCreate: created new work process W7-22067

Mon Sep 16 03:07:41:599 2019


DpWpCheck: dyn W20, pid 7183 no longer needed, terminate now

Mon Sep 16 03:07:42:663 2019


DpHdlDeadWp: W20 (pid=7183) terminated automatically

Mon Sep 16 03:12:02:809 2019


DpHdlDeadWp: W7 (pid=22067) terminated automatically

Mon Sep 16 03:12:24:605 2019


DpWpDynCreate: created new work process W18-2673

Mon Sep 16 03:17:41:614 2019


DpWpCheck: dyn W18, pid 2673 no longer needed, terminate now

Mon Sep 16 03:17:42:202 2019


DpHdlDeadWp: W18 (pid=2673) terminated automatically

Mon Sep 16 03:18:18:336 2019


DpWpDynCreate: created new work process W19-5002

Mon Sep 16 03:18:19:916 2019


DpWpDynCreate: created new work process W17-5014

Mon Sep 16 03:23:21:916 2019


DpWpCheck: dyn W17, pid 5014 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=5002) terminated automatically

Mon Sep 16 03:23:22:796 2019


DpHdlDeadWp: W17 (pid=5014) terminated automatically

Mon Sep 16 03:24:12:732 2019


DpWpDynCreate: created new work process W20-7052

Mon Sep 16 03:29:21:927 2019


DpWpCheck: dyn W20, pid 7052 no longer needed, terminate now

Mon Sep 16 03:29:22:202 2019


DpHdlDeadWp: W20 (pid=7052) terminated automatically

Mon Sep 16 03:34:01:763 2019


DpWpDynCreate: created new work process W7-10862

Mon Sep 16 03:35:06:128 2019


DpWpDynCreate: created new work process W18-11218

Mon Sep 16 03:39:02:753 2019


DpHdlDeadWp: W7 (pid=10862) terminated automatically

Mon Sep 16 03:40:21:947 2019


DpWpCheck: dyn W18, pid 11218 no longer needed, terminate now

Mon Sep 16 03:40:22:724 2019


DpHdlDeadWp: W18 (pid=11218) terminated automatically

Mon Sep 16 03:41:01:809 2019


DpWpDynCreate: created new work process W19-12993

Mon Sep 16 03:46:03:059 2019


DpHdlDeadWp: W19 (pid=12993) terminated automatically
DpWpDynCreate: created new work process W17-15003

Mon Sep 16 03:49:01:540 2019


DpWpDynCreate: created new work process W20-15906

Mon Sep 16 03:51:04:415 2019


DpHdlDeadWp: W17 (pid=15003) terminated automatically
Mon Sep 16 03:54:21:973 2019
DpWpCheck: dyn W20, pid 15906 no longer needed, terminate now

Mon Sep 16 03:54:22:618 2019


DpHdlDeadWp: W20 (pid=15906) terminated automatically

Mon Sep 16 04:06:32:441 2019


DpWpDynCreate: created new work process W7-7012

Mon Sep 16 04:09:08:146 2019


DpWpDynCreate: created new work process W18-18164
DpWpDynCreate: created new work process W19-18165

Mon Sep 16 04:11:42:009 2019


DpWpCheck: dyn W7, pid 7012 no longer needed, terminate now

Mon Sep 16 04:11:42:515 2019


DpHdlDeadWp: W7 (pid=7012) terminated automatically

Mon Sep 16 04:14:11:362 2019


DpHdlDeadWp: W18 (pid=18164) terminated automatically
DpHdlDeadWp: W19 (pid=18165) terminated automatically

Mon Sep 16 04:15:05:502 2019


DpWpDynCreate: created new work process W17-20799

Mon Sep 16 04:16:30:130 2019


DpWpDynCreate: created new work process W20-21362

Mon Sep 16 04:20:22:029 2019


DpWpCheck: dyn W17, pid 20799 no longer needed, terminate now

Mon Sep 16 04:20:22:794 2019


DpHdlDeadWp: W17 (pid=20799) terminated automatically

Mon Sep 16 04:21:32:559 2019


DpHdlDeadWp: W20 (pid=21362) terminated automatically

Mon Sep 16 04:24:24:191 2019


DpWpDynCreate: created new work process W7-24242

Mon Sep 16 04:29:42:046 2019


DpWpCheck: dyn W7, pid 24242 no longer needed, terminate now

Mon Sep 16 04:29:42:960 2019


DpHdlDeadWp: W7 (pid=24242) terminated automatically

Mon Sep 16 04:40:04:810 2019


DpWpDynCreate: created new work process W18-29590

Mon Sep 16 04:45:22:074 2019


DpWpCheck: dyn W18, pid 29590 no longer needed, terminate now

Mon Sep 16 04:45:23:008 2019


DpHdlDeadWp: W18 (pid=29590) terminated automatically

Mon Sep 16 04:46:26:333 2019


DpHdlDeadWp: W0 (pid=32119) terminated automatically
DpWpDynCreate: created new work process W0-31517
Mon Sep 16 04:48:07:836 2019
DpWpDynCreate: created new work process W19-31995

Mon Sep 16 04:53:22:091 2019


DpWpCheck: dyn W19, pid 31995 no longer needed, terminate now

Mon Sep 16 04:53:22:731 2019


DpHdlDeadWp: W19 (pid=31995) terminated automatically

Mon Sep 16 04:56:09:266 2019


*** ERROR => NiIRead: invalid data (0x300002f/0x8800;mode=0;hdl 55;peer=194.61.24.56:1005;local=3200) [nixxi.cpp 5226]

Mon Sep 16 05:00:22:104 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T7_U9763 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T7_U9763_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:18:46|5 |RSABAPPROGRAM |high| | |SE38 |
DpHdlSoftCancel: cancel request for T7_U9763_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Mon Sep 16 05:01:18:706 2019


DpHdlDeadWp: W12 (pid=7911) terminated automatically
DpWpDynCreate: created new work process W12-4708

Mon Sep 16 05:03:04:529 2019


DpWpDynCreate: created new work process W17-12050

Mon Sep 16 05:08:12:788 2019


DpHdlDeadWp: W17 (pid=12050) terminated automatically

Mon Sep 16 05:09:59:889 2019


DpWpDynCreate: created new work process W20-2939

Mon Sep 16 05:15:02:136 2019


DpWpCheck: dyn W20, pid 2939 no longer needed, terminate now

Mon Sep 16 05:15:02:408 2019


DpHdlDeadWp: W20 (pid=2939) terminated automatically

Mon Sep 16 05:16:00:571 2019


DpWpDynCreate: created new work process W7-4986

Mon Sep 16 05:18:07:852 2019


DpWpDynCreate: created new work process W18-5926

Mon Sep 16 05:21:01:410 2019


DpHdlDeadWp: W7 (pid=4986) terminated automatically

Mon Sep 16 05:23:12:668 2019


DpHdlDeadWp: W18 (pid=5926) terminated automatically

Mon Sep 16 05:29:59:786 2019


DpWpDynCreate: created new work process W19-10209

Mon Sep 16 05:35:01:368 2019


DpHdlDeadWp: W19 (pid=10209) terminated automatically
Mon Sep 16 05:41:59:202 2019
DpWpDynCreate: created new work process W17-13960

Mon Sep 16 05:47:00:383 2019


DpHdlDeadWp: W17 (pid=13960) terminated automatically

Mon Sep 16 05:56:59:258 2019


DpWpDynCreate: created new work process W20-18903

Mon Sep 16 05:58:59:340 2019


DpWpDynCreate: created new work process W7-19593

Mon Sep 16 06:02:01:496 2019


DpHdlDeadWp: W20 (pid=18903) terminated automatically

Mon Sep 16 06:03:56:240 2019


DpWpDynCreate: created new work process W18-29437

Mon Sep 16 06:04:00:706 2019


DpHdlDeadWp: W7 (pid=19593) terminated automatically

Mon Sep 16 06:09:02:231 2019


DpWpCheck: dyn W18, pid 29437 no longer needed, terminate now

Mon Sep 16 06:09:02:816 2019


DpHdlDeadWp: W18 (pid=29437) terminated automatically

Mon Sep 16 06:11:00:256 2019


DpWpDynCreate: created new work process W19-19102

Mon Sep 16 06:11:00:788 2019


DpHdlDeadWp: W13 (pid=25178) terminated automatically
DpWpDynCreate: created new work process W13-19105

Mon Sep 16 06:12:45:251 2019


DpHdlDeadWp: W9 (pid=13434) terminated automatically
DpWpDynCreate: created new work process W9-19541

Mon Sep 16 06:16:02:247 2019


DpWpCheck: dyn W19, pid 19102 no longer needed, terminate now

Mon Sep 16 06:16:02:391 2019


DpHdlDeadWp: W19 (pid=19102) terminated automatically

Mon Sep 16 06:17:30:323 2019


DpWpDynCreate: created new work process W17-21454

Mon Sep 16 06:17:30:605 2019


DpHdlDeadWp: W11 (pid=13212) terminated automatically
DpWpDynCreate: created new work process W11-21457

Mon Sep 16 06:20:32:174 2019


DpHdlDeadWp: W9 (pid=19541) terminated automatically
DpWpDynCreate: created new work process W9-22390

Mon Sep 16 06:22:42:270 2019


DpWpCheck: dyn W17, pid 21454 no longer needed, terminate now
Mon Sep 16 06:22:42:967 2019
DpHdlDeadWp: W17 (pid=21454) terminated automatically

Mon Sep 16 06:23:51:328 2019


DpWpDynCreate: created new work process W20-23451

Mon Sep 16 06:24:58:495 2019


DpHdlDeadWp: W11 (pid=21457) terminated automatically
DpWpDynCreate: created new work process W11-24047
DpHdlDeadWp: W12 (pid=4708) terminated automatically
DpWpDynCreate: created new work process W12-24048

Mon Sep 16 06:24:59:400 2019


DpWpDynCreate: created new work process W7-24053

Mon Sep 16 06:27:19:071 2019


DpHdlDeadWp: W11 (pid=24047) terminated automatically
DpWpDynCreate: created new work process W11-25114

Mon Sep 16 06:29:02:281 2019


DpWpCheck: dyn W20, pid 23451 no longer needed, terminate now

Mon Sep 16 06:29:03:143 2019


DpHdlDeadWp: W20 (pid=23451) terminated automatically

Mon Sep 16 06:29:17:479 2019


DpHdlDeadWp: W9 (pid=22390) terminated automatically
DpWpDynCreate: created new work process W9-25707

Mon Sep 16 06:30:02:282 2019


DpWpCheck: dyn W7, pid 24053 no longer needed, terminate now

Mon Sep 16 06:30:03:340 2019


DpHdlDeadWp: W7 (pid=24053) terminated automatically

Mon Sep 16 06:33:16:717 2019


DpWpDynCreate: created new work process W18-27191

Mon Sep 16 06:33:17:675 2019


DpHdlDeadWp: W13 (pid=19105) terminated automatically
DpWpDynCreate: created new work process W13-27200

Mon Sep 16 06:35:08:567 2019


DpHdlDeadWp: W12 (pid=24048) terminated automatically
DpWpDynCreate: created new work process W12-27830

Mon Sep 16 06:37:25:204 2019


DpHdlDeadWp: W11 (pid=25114) terminated automatically
DpWpDynCreate: created new work process W11-28558

Mon Sep 16 06:38:22:304 2019


DpWpCheck: dyn W18, pid 27191 no longer needed, terminate now

Mon Sep 16 06:38:22:996 2019


DpHdlDeadWp: W18 (pid=27191) terminated automatically

Mon Sep 16 06:39:51:675 2019


DpWpDynCreate: created new work process W19-29312
Mon Sep 16 06:39:51:859 2019
DpHdlDeadWp: W13 (pid=27200) terminated automatically
DpWpDynCreate: created new work process W13-29313

Mon Sep 16 06:42:21:549 2019


DpHdlDeadWp: W12 (pid=27830) terminated automatically
DpWpDynCreate: created new work process W12-30145

Mon Sep 16 06:45:02:318 2019


DpWpCheck: dyn W19, pid 29312 no longer needed, terminate now

Mon Sep 16 06:45:02:714 2019


DpHdlDeadWp: W19 (pid=29312) terminated automatically

Mon Sep 16 06:45:34:478 2019


DpWpDynCreate: created new work process W17-31459

Mon Sep 16 06:45:34:990 2019


DpHdlDeadWp: W13 (pid=29313) terminated automatically
DpWpDynCreate: created new work process W13-31460

Mon Sep 16 06:49:37:598 2019


DpHdlDeadWp: W12 (pid=30145) terminated automatically
DpWpDynCreate: created new work process W12-32748

Mon Sep 16 06:50:42:328 2019


DpWpCheck: dyn W17, pid 31459 no longer needed, terminate now

Mon Sep 16 06:50:42:669 2019


DpHdlDeadWp: W17 (pid=31459) terminated automatically

Mon Sep 16 06:51:15:662 2019


DpWpDynCreate: created new work process W20-805

Mon Sep 16 06:51:16:490 2019


DpHdlDeadWp: W9 (pid=25707) terminated automatically
DpWpDynCreate: created new work process W9-808

Mon Sep 16 06:51:39:005 2019


DpHdlDeadWp: W12 (pid=32748) terminated automatically
DpWpDynCreate: created new work process W12-895

Mon Sep 16 06:56:22:338 2019


DpWpCheck: dyn W20, pid 805 no longer needed, terminate now

Mon Sep 16 06:56:23:220 2019


DpHdlDeadWp: W20 (pid=805) terminated automatically

Mon Sep 16 06:57:20:596 2019


DpWpDynCreate: created new work process W7-3074

Mon Sep 16 06:59:11:898 2019


DpWpDynCreate: created new work process W18-3779

Mon Sep 16 06:59:12:324 2019


DpHdlDeadWp: W11 (pid=28558) terminated automatically
DpWpDynCreate: created new work process W11-3780

Mon Sep 16 06:59:18:660 2019


DpHdlDeadWp: W12 (pid=895) terminated automatically
DpWpDynCreate: created new work process W12-3786

Mon Sep 16 07:02:22:349 2019


DpWpCheck: dyn W7, pid 3074 no longer needed, terminate now

Mon Sep 16 07:02:22:581 2019


DpHdlDeadWp: W7 (pid=3074) terminated automatically

Mon Sep 16 07:03:46:075 2019


DpHdlDeadWp: W9 (pid=808) terminated automatically
DpWpDynCreate: created new work process W9-11945

Mon Sep 16 07:04:22:352 2019


DpWpCheck: dyn W18, pid 3779 no longer needed, terminate now

Mon Sep 16 07:04:22:774 2019


DpHdlDeadWp: W18 (pid=3779) terminated automatically

Mon Sep 16 07:06:05:304 2019


DpHdlDeadWp: W11 (pid=3780) terminated automatically
DpWpDynCreate: created new work process W11-20107

Mon Sep 16 07:06:07:302 2019


DpWpDynCreate: created new work process W19-20120

Mon Sep 16 07:06:07:909 2019


DpHdlDeadWp: W12 (pid=3786) terminated automatically
DpWpDynCreate: created new work process W12-20127

Mon Sep 16 07:11:22:365 2019


DpWpCheck: dyn W19, pid 20120 no longer needed, terminate now

Mon Sep 16 07:11:22:559 2019


DpHdlDeadWp: W19 (pid=20120) terminated automatically

Mon Sep 16 07:12:44:869 2019


DpWpDynCreate: created new work process W17-4069

Mon Sep 16 07:12:45:989 2019


DpHdlDeadWp: W9 (pid=11945) terminated automatically
DpWpDynCreate: created new work process W9-4074

Mon Sep 16 07:13:28:410 2019


DpHdlDeadWp: W13 (pid=31460) terminated automatically
DpWpDynCreate: created new work process W13-4288

Mon Sep 16 07:18:02:377 2019


DpWpCheck: dyn W17, pid 4069 no longer needed, terminate now

Mon Sep 16 07:18:02:608 2019


DpHdlDeadWp: W17 (pid=4069) terminated automatically

Mon Sep 16 07:19:21:864 2019


DpWpDynCreate: created new work process W20-6416

Mon Sep 16 07:19:22:197 2019


DpHdlDeadWp: W11 (pid=20107) terminated automatically
DpWpDynCreate: created new work process W11-6418
Mon Sep 16 07:21:51:198 2019
DpHdlDeadWp: W9 (pid=4074) terminated automatically
DpWpDynCreate: created new work process W9-7392

Mon Sep 16 07:23:26:438 2019


DpHdlDeadWp: W12 (pid=20127) terminated automatically
DpWpDynCreate: created new work process W12-8014

Mon Sep 16 07:24:00:066 2019


DpWpDynCreate: created new work process W7-8203

Mon Sep 16 07:24:22:385 2019


DpWpCheck: dyn W20, pid 6416 no longer needed, terminate now

Mon Sep 16 07:24:22:491 2019


DpHdlDeadWp: W20 (pid=6416) terminated automatically

Mon Sep 16 07:29:11:061 2019


DpHdlDeadWp: W7 (pid=8203) terminated automatically

Mon Sep 16 07:36:00:964 2019


DpWpDynCreate: created new work process W18-12319

Mon Sep 16 07:41:02:413 2019


DpWpCheck: dyn W18, pid 12319 no longer needed, terminate now

Mon Sep 16 07:41:02:735 2019


DpHdlDeadWp: W18 (pid=12319) terminated automatically

Mon Sep 16 07:46:00:045 2019


DpWpDynCreate: created new work process W19-15514

Mon Sep 16 07:51:01:772 2019


DpHdlDeadWp: W19 (pid=15514) terminated automatically
DpWpDynCreate: created new work process W17-17190

Mon Sep 16 07:56:02:554 2019


DpHdlDeadWp: W17 (pid=17190) terminated automatically

Mon Sep 16 07:56:04:519 2019


DpHdlDeadWp: W10 (pid=870) terminated automatically
DpWpDynCreate: created new work process W10-18774

Mon Sep 16 07:59:06:376 2019


DpWpDynCreate: created new work process W20-19759

Mon Sep 16 08:02:02:452 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T14_U15326 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T14_U15326_M0 |001|EXT_RKUDUMUL|SST-LAP-LEN0028 |07:00:16|7 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T14_U15326_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Mon Sep 16 08:04:22:456 2019


DpWpCheck: dyn W20, pid 19759 no longer needed, terminate now
Mon Sep 16 08:04:22:643 2019
DpHdlDeadWp: W20 (pid=19759) terminated automatically

Mon Sep 16 08:05:17:011 2019


DpWpDynCreate: created new work process W7-3495

Mon Sep 16 08:10:22:466 2019


DpWpCheck: dyn W7, pid 3495 no longer needed, terminate now

Mon Sep 16 08:10:22:701 2019


DpHdlDeadWp: W7 (pid=3495) terminated automatically

Mon Sep 16 08:12:04:063 2019


DpWpDynCreate: created new work process W18-19486

Mon Sep 16 08:17:06:458 2019


DpHdlDeadWp: W18 (pid=19486) terminated automatically

Mon Sep 16 08:35:11:771 2019


DpWpDynCreate: created new work process W19-27679

Mon Sep 16 08:40:07:203 2019


DpWpDynCreate: created new work process W17-29481

Mon Sep 16 08:40:14:115 2019


DpHdlDeadWp: W19 (pid=27679) terminated automatically

Mon Sep 16 08:45:22:525 2019


DpWpCheck: dyn W17, pid 29481 no longer needed, terminate now

Mon Sep 16 08:45:22:786 2019


DpHdlDeadWp: W17 (pid=29481) terminated automatically

Mon Sep 16 08:48:00:122 2019


DpWpDynCreate: created new work process W20-32022
DpWpDynCreate: created new work process W7-32029

Mon Sep 16 08:51:05:464 2019


DpWpDynCreate: created new work process W18-774

Mon Sep 16 08:53:01:758 2019


DpHdlDeadWp: W20 (pid=32022) terminated automatically

Mon Sep 16 08:53:02:231 2019


DpHdlDeadWp: W7 (pid=32029) terminated automatically

Mon Sep 16 08:56:22:543 2019


DpWpCheck: dyn W18, pid 774 no longer needed, terminate now

Mon Sep 16 08:56:23:156 2019


DpHdlDeadWp: W18 (pid=774) terminated automatically

Mon Sep 16 08:58:00:243 2019


DpWpDynCreate: created new work process W19-3213

Mon Sep 16 08:58:05:818 2019


DpWpDynCreate: created new work process W17-3306

Mon Sep 16 08:59:02:549 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T48_U12999 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T48_U12999_M0 |001|EXT_RKUDUMUL|SST-LAP-LEN0028 |08:17:42|16 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T48_U12999_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Mon Sep 16 09:03:01:282 2019


DpHdlDeadWp: W19 (pid=3213) terminated automatically

Mon Sep 16 09:03:04:749 2019


DpWpDynCreate: created new work process W20-10594

Mon Sep 16 09:03:22:555 2019


DpWpCheck: dyn W17, pid 3306 no longer needed, terminate now

Mon Sep 16 09:03:22:713 2019


DpHdlDeadWp: W17 (pid=3306) terminated automatically

Mon Sep 16 09:06:07:668 2019


DpWpDynCreate: created new work process W7-21806

Mon Sep 16 09:08:05:461 2019


DpHdlDeadWp: W20 (pid=10594) terminated automatically

Mon Sep 16 09:10:00:516 2019


DpWpDynCreate: created new work process W18-2723

Mon Sep 16 09:11:22:571 2019


DpWpCheck: dyn W7, pid 21806 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=21806) terminated automatically

Mon Sep 16 09:12:07:740 2019


DpWpDynCreate: created new work process W19-3477

Mon Sep 16 09:15:01:599 2019


DpHdlDeadWp: W18 (pid=2723) terminated automatically

Mon Sep 16 09:17:22:578 2019


DpWpCheck: dyn W19, pid 3477 no longer needed, terminate now

Mon Sep 16 09:17:23:346 2019


DpHdlDeadWp: W19 (pid=3477) terminated automatically

Mon Sep 16 09:24:00:156 2019


DpWpDynCreate: created new work process W17-7767

Mon Sep 16 09:29:01:328 2019


DpHdlDeadWp: W17 (pid=7767) terminated automatically

Mon Sep 16 09:35:11:814 2019


DpWpDynCreate: created new work process W20-11768

Mon Sep 16 09:35:13:088 2019


DpWpDynCreate: created new work process W7-11772

Mon Sep 16 09:40:12:873 2019


DpHdlDeadWp: W20 (pid=11768) terminated automatically
Mon Sep 16 09:40:14:547 2019
DpHdlDeadWp: W7 (pid=11772) terminated automatically

Mon Sep 16 09:49:00:159 2019


DpWpDynCreate: created new work process W18-16039

Mon Sep 16 09:54:02:820 2019


DpHdlDeadWp: W18 (pid=16039) terminated automatically

Mon Sep 16 09:56:04:358 2019


DpWpDynCreate: created new work process W19-18544

Mon Sep 16 10:00:00:638 2019


DpWpDynCreate: created new work process W17-19602

Mon Sep 16 10:00:42:830 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T46_U13003 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T46_U13003_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |09:19:04|16 |SAPMCRFC |high| | |SM59 |
DpHdlSoftCancel: cancel request for T46_U13003_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Mon Sep 16 10:01:22:831 2019


DpWpCheck: dyn W19, pid 18544 no longer needed, terminate now

Mon Sep 16 10:01:23:613 2019


DpHdlDeadWp: W19 (pid=18544) terminated automatically

Mon Sep 16 10:05:01:254 2019


DpHdlDeadWp: W17 (pid=19602) terminated automatically

Mon Sep 16 10:06:00:015 2019


DpWpDynCreate: created new work process W20-5887

Mon Sep 16 10:10:08:346 2019


DpWpDynCreate: created new work process W7-18569

Mon Sep 16 10:11:01:847 2019


DpHdlDeadWp: W20 (pid=5887) terminated automatically

Mon Sep 16 10:15:09:203 2019


DpHdlDeadWp: W7 (pid=18569) terminated automatically

Mon Sep 16 10:20:08:745 2019


DpWpDynCreate: created new work process W18-22191

Mon Sep 16 10:25:11:109 2019


DpHdlDeadWp: W18 (pid=22191) terminated automatically

Mon Sep 16 10:25:15:463 2019


DpWpDynCreate: created new work process W19-24062

Mon Sep 16 10:30:22:876 2019


DpWpCheck: dyn W19, pid 24062 no longer needed, terminate now

Mon Sep 16 10:30:23:430 2019


DpHdlDeadWp: W19 (pid=24062) terminated automatically

Mon Sep 16 10:33:00:195 2019


DpWpDynCreate: created new work process W17-26704

Mon Sep 16 10:35:12:313 2019


DpWpDynCreate: created new work process W20-27384

Mon Sep 16 10:38:01:966 2019


DpHdlDeadWp: W17 (pid=26704) terminated automatically

Mon Sep 16 10:40:22:895 2019


DpWpCheck: dyn W20, pid 27384 no longer needed, terminate now

Mon Sep 16 10:40:23:167 2019


DpHdlDeadWp: W20 (pid=27384) terminated automatically

Mon Sep 16 10:41:01:212 2019


DpWpDynCreate: created new work process W7-29336

Mon Sep 16 10:46:02:904 2019


DpWpCheck: dyn W7, pid 29336 no longer needed, terminate now

Mon Sep 16 10:46:03:731 2019


DpHdlDeadWp: W7 (pid=29336) terminated automatically

Mon Sep 16 10:50:09:461 2019


DpWpDynCreate: created new work process W18-32384
DpWpDynCreate: created new work process W19-32385

Mon Sep 16 10:53:08:834 2019


DpWpDynCreate: created new work process W17-886

Mon Sep 16 10:55:10:314 2019


DpHdlDeadWp: W19 (pid=32385) terminated automatically

Mon Sep 16 10:55:18:361 2019


DpHdlDeadWp: W18 (pid=32384) terminated automatically

Mon Sep 16 10:58:10:250 2019


DpHdlDeadWp: W17 (pid=886) terminated automatically

Mon Sep 16 11:00:40:570 2019


DpWpDynCreate: created new work process W20-3713

Mon Sep 16 11:04:00:262 2019


DpWpDynCreate: created new work process W7-13818

Mon Sep 16 11:04:00:464 2019


DpWpDynCreate: created new work process W19-13845

Mon Sep 16 11:05:42:938 2019


DpWpCheck: dyn W20, pid 3713 no longer needed, terminate now

Mon Sep 16 11:05:43:175 2019


DpHdlDeadWp: W20 (pid=3713) terminated automatically

Mon Sep 16 11:05:59:910 2019


DpWpDynCreate: created new work process W18-18988
DpWpDynCreate: created new work process W17-18993

Mon Sep 16 11:09:02:072 2019


DpHdlDeadWp: W7 (pid=13818) terminated automatically

Mon Sep 16 11:09:03:878 2019


DpHdlDeadWp: W19 (pid=13845) terminated automatically

Mon Sep 16 11:11:01:625 2019


DpWpCheck: dyn W17, pid 18993 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=18988) terminated automatically

Mon Sep 16 11:11:02:050 2019


DpHdlDeadWp: W17 (pid=18993) terminated automatically

Mon Sep 16 11:20:00:756 2019


DpWpDynCreate: created new work process W20-6112

Mon Sep 16 11:25:01:230 2019


DpHdlDeadWp: W20 (pid=6112) terminated automatically

Mon Sep 16 11:25:09:688 2019


DpWpDynCreate: created new work process W7-7792

Mon Sep 16 11:30:10:360 2019


DpHdlDeadWp: W7 (pid=7792) terminated automatically

Mon Sep 16 11:30:22:986 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T14_U4653 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T14_U4653_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |11:09:03|6 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T14_U4653_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Mon Sep 16 11:31:03:600 2019


DpWpDynCreate: created new work process W19-10214

Mon Sep 16 11:35:08:940 2019


DpWpDynCreate: created new work process W18-11369

Mon Sep 16 11:35:09:051 2019


DpWpDynCreate: created new work process W17-11370

Mon Sep 16 11:36:04:792 2019


DpHdlDeadWp: W19 (pid=10214) terminated automatically

Mon Sep 16 11:40:23:004 2019


DpWpCheck: dyn W17, pid 11370 no longer needed, terminate now
DpWpCheck: dyn W18, pid 11369 no longer needed, terminate now

Mon Sep 16 11:40:24:195 2019


DpHdlDeadWp: W17 (pid=11370) terminated automatically
DpHdlDeadWp: W18 (pid=11369) terminated automatically

Mon Sep 16 11:41:07:916 2019


DpWpDynCreate: created new work process W20-13368
Mon Sep 16 11:46:08:514 2019
DpHdlDeadWp: W20 (pid=13368) terminated automatically

Mon Sep 16 11:47:59:977 2019


DpWpDynCreate: created new work process W7-15600

Mon Sep 16 11:53:00:654 2019


DpHdlDeadWp: W7 (pid=15600) terminated automatically

Mon Sep 16 11:56:11:193 2019


DpWpDynCreate: created new work process W19-18321

Mon Sep 16 12:01:23:038 2019


DpWpCheck: dyn W19, pid 18321 no longer needed, terminate now

Mon Sep 16 12:01:25:057 2019


DpHdlDeadWp: W19 (pid=18321) terminated automatically

Mon Sep 16 12:04:00:014 2019


DpWpDynCreate: created new work process W17-25526

Mon Sep 16 12:09:01:696 2019


DpHdlDeadWp: W17 (pid=25526) terminated automatically

Mon Sep 16 12:12:00:141 2019


DpWpDynCreate: created new work process W18-18803

Mon Sep 16 12:17:03:071 2019


DpWpCheck: dyn W18, pid 18803 no longer needed, terminate now

Mon Sep 16 12:17:04:111 2019


DpHdlDeadWp: W18 (pid=18803) terminated automatically

Mon Sep 16 12:23:04:523 2019


DpWpDynCreate: created new work process W20-22757

Mon Sep 16 12:25:59:833 2019


DpWpDynCreate: created new work process W7-23748
DpWpDynCreate: created new work process W19-23751

Mon Sep 16 12:27:00:151 2019


DpWpDynCreate: created new work process W17-24026

Mon Sep 16 12:28:19:141 2019


DpHdlDeadWp: W20 (pid=22757) terminated automatically

Mon Sep 16 12:28:59:813 2019


DpWpDynCreate: created new work process W18-24875

Mon Sep 16 12:31:00:791 2019


DpHdlDeadWp: W19 (pid=23751) terminated automatically

Mon Sep 16 12:31:01:454 2019


DpHdlDeadWp: W7 (pid=23748) terminated automatically

Mon Sep 16 12:32:03:096 2019


DpWpCheck: dyn W17, pid 24026 no longer needed, terminate now

Mon Sep 16 12:32:03:908 2019


DpHdlDeadWp: W17 (pid=24026) terminated automatically

Mon Sep 16 12:34:00:847 2019


DpHdlDeadWp: W18 (pid=24875) terminated automatically

Mon Sep 16 12:56:09:185 2019


DpWpDynCreate: created new work process W20-1554

Mon Sep 16 12:56:12:284 2019


DpWpDynCreate: created new work process W19-1558

Mon Sep 16 13:01:23:144 2019


DpWpCheck: dyn W19, pid 1558 no longer needed, terminate now
DpWpCheck: dyn W20, pid 1554 no longer needed, terminate now

Mon Sep 16 13:01:23:398 2019


DpHdlDeadWp: W19 (pid=1558) terminated automatically
DpHdlDeadWp: W20 (pid=1554) terminated automatically

Mon Sep 16 13:05:48:746 2019


DpWpDynCreate: created new work process W7-21143

Mon Sep 16 13:11:00:241 2019


DpHdlDeadWp: W7 (pid=21143) terminated automatically

Mon Sep 16 13:11:12:413 2019


DpWpDynCreate: created new work process W17-2302

Mon Sep 16 13:16:13:557 2019


DpHdlDeadWp: W17 (pid=2302) terminated automatically

Mon Sep 16 13:21:08:742 2019


DpWpDynCreate: created new work process W18-5988

Mon Sep 16 13:26:09:188 2019


DpHdlDeadWp: W18 (pid=5988) terminated automatically

Mon Sep 16 13:26:09:386 2019


DpWpDynCreate: created new work process W19-7536

Mon Sep 16 13:27:00:020 2019


DpWpDynCreate: created new work process W20-7758
DpWpDynCreate: created new work process W7-7762

Mon Sep 16 13:31:23:199 2019


DpWpCheck: dyn W19, pid 7536 no longer needed, terminate now

Mon Sep 16 13:31:24:274 2019


DpHdlDeadWp: W19 (pid=7536) terminated automatically

Mon Sep 16 13:32:03:199 2019


DpWpCheck: dyn W7, pid 7762 no longer needed, terminate now
DpWpCheck: dyn W20, pid 7758 no longer needed, terminate now

Mon Sep 16 13:32:03:390 2019


DpHdlDeadWp: W7 (pid=7762) terminated automatically
DpHdlDeadWp: W20 (pid=7758) terminated automatically

Mon Sep 16 13:36:17:165 2019


DpWpDynCreate: created new work process W17-11042

Mon Sep 16 13:40:00:611 2019


DpWpDynCreate: created new work process W18-12415

Mon Sep 16 13:40:01:606 2019


DpWpDynCreate: created new work process W19-12482

Mon Sep 16 13:41:18:978 2019


DpHdlDeadWp: W17 (pid=11042) terminated automatically

Mon Sep 16 13:45:01:761 2019


DpHdlDeadWp: W18 (pid=12415) terminated automatically

Mon Sep 16 13:45:03:849 2019


DpHdlDeadWp: W19 (pid=12482) terminated automatically

Mon Sep 16 13:56:11:941 2019


DpWpDynCreate: created new work process W7-17630

Mon Sep 16 13:56:14:469 2019


DpWpDynCreate: created new work process W20-17648

Mon Sep 16 13:58:02:931 2019


DpHdlDeadWp: W9 (pid=7392) terminated automatically
DpWpDynCreate: created new work process W9-18255

Mon Sep 16 14:00:37:104 2019


DpHdlDeadWp: W12 (pid=8014) terminated automatically
DpWpDynCreate: created new work process W12-19470

Mon Sep 16 14:01:23:252 2019


DpWpCheck: dyn W7, pid 17630 no longer needed, terminate now
DpWpCheck: dyn W20, pid 17648 no longer needed, terminate now

Mon Sep 16 14:01:23:591 2019


DpHdlDeadWp: W7 (pid=17630) terminated automatically

Mon Sep 16 14:01:25:421 2019


DpHdlDeadWp: W20 (pid=17648) terminated automatically

Mon Sep 16 14:06:15:257 2019


DpWpDynCreate: created new work process W17-2808

Mon Sep 16 14:08:59:915 2019


DpWpDynCreate: created new work process W18-13040

Mon Sep 16 14:11:23:269 2019


DpWpCheck: dyn W17, pid 2808 no longer needed, terminate now

Mon Sep 16 14:11:23:882 2019


DpHdlDeadWp: W17 (pid=2808) terminated automatically

Mon Sep 16 14:14:00:848 2019


DpHdlDeadWp: W18 (pid=13040) terminated automatically

Mon Sep 16 14:21:08:969 2019


DpWpDynCreate: created new work process W19-21796

Mon Sep 16 14:25:00:732 2019


DpWpDynCreate: created new work process W7-22866

Mon Sep 16 14:26:07:928 2019


DpWpDynCreate: created new work process W20-23309

Mon Sep 16 14:26:23:298 2019


DpWpCheck: dyn W19, pid 21796 no longer needed, terminate now

Mon Sep 16 14:26:23:625 2019


DpHdlDeadWp: W19 (pid=21796) terminated automatically

Mon Sep 16 14:30:01:302 2019


DpHdlDeadWp: W7 (pid=22866) terminated automatically
DpWpDynCreate: created new work process W17-24840

Mon Sep 16 14:31:09:915 2019


DpHdlDeadWp: W20 (pid=23309) terminated automatically

Mon Sep 16 14:32:06:640 2019


DpWpDynCreate: created new work process W18-25690

Mon Sep 16 14:33:02:190 2019


DpWpDynCreate: created new work process W19-25971

Mon Sep 16 14:35:02:586 2019


DpHdlDeadWp: W17 (pid=24840) terminated automatically

Mon Sep 16 14:36:00:432 2019


DpWpDynCreate: created new work process W7-26808

Mon Sep 16 14:37:23:317 2019


DpWpCheck: dyn W18, pid 25690 no longer needed, terminate now

Mon Sep 16 14:37:24:225 2019


DpHdlDeadWp: W18 (pid=25690) terminated automatically

Mon Sep 16 14:38:04:274 2019


DpHdlDeadWp: W19 (pid=25971) terminated automatically

Mon Sep 16 14:41:01:597 2019


DpHdlDeadWp: W7 (pid=26808) terminated automatically

Mon Sep 16 14:41:04:303 2019


DpWpDynCreate: created new work process W20-28544

Mon Sep 16 14:41:11:260 2019


DpWpDynCreate: created new work process W17-28586

Mon Sep 16 14:46:07:915 2019


DpHdlDeadWp: W20 (pid=28544) terminated automatically

Mon Sep 16 14:46:11:972 2019


DpWpDynCreate: created new work process W18-30501

Mon Sep 16 14:46:25:497 2019


DpHdlDeadWp: W17 (pid=28586) terminated automatically

Mon Sep 16 14:48:01:237 2019


DpWpDynCreate: created new work process W19-30932

Mon Sep 16 14:51:23:339 2019


DpWpCheck: dyn W18, pid 30501 no longer needed, terminate now

Mon Sep 16 14:51:23:611 2019


DpHdlDeadWp: W18 (pid=30501) terminated automatically

Mon Sep 16 14:53:03:294 2019


DpHdlDeadWp: W19 (pid=30932) terminated automatically
DpWpDynCreate: created new work process W7-32651

Mon Sep 16 14:56:07:731 2019


DpWpDynCreate: created new work process W20-1302

Mon Sep 16 14:56:12:113 2019


DpWpDynCreate: created new work process W17-1377

Mon Sep 16 14:58:04:781 2019


DpHdlDeadWp: W7 (pid=32651) terminated automatically

Mon Sep 16 15:01:08:986 2019


DpHdlDeadWp: W20 (pid=1302) terminated automatically

Mon Sep 16 15:01:12:654 2019


DpWpDynCreate: created new work process W18-3336

Mon Sep 16 15:01:14:102 2019


DpHdlDeadWp: W17 (pid=1377) terminated automatically

Mon Sep 16 15:06:23:367 2019


DpWpCheck: dyn W18, pid 3336 no longer needed, terminate now

Mon Sep 16 15:06:24:407 2019


DpHdlDeadWp: W18 (pid=3336) terminated automatically

Mon Sep 16 15:08:52:067 2019


DpWpDynCreate: created new work process W19-31123

Mon Sep 16 15:10:04:357 2019


DpWpDynCreate: created new work process W7-1326

Mon Sep 16 15:14:03:384 2019


DpWpCheck: dyn W19, pid 31123 no longer needed, terminate now

Mon Sep 16 15:14:03:777 2019


DpHdlDeadWp: W19 (pid=31123) terminated automatically

Mon Sep 16 15:15:18:989 2019


DpHdlDeadWp: W7 (pid=1326) terminated automatically

Mon Sep 16 15:17:59:962 2019


DpWpDynCreate: created new work process W20-4296

Mon Sep 16 15:18:00:299 2019


DpWpDynCreate: created new work process W17-4328

Mon Sep 16 15:23:00:871 2019


DpHdlDeadWp: W20 (pid=4296) terminated automatically

Mon Sep 16 15:23:03:400 2019


DpWpCheck: dyn W17, pid 4328 no longer needed, terminate now

Mon Sep 16 15:23:03:846 2019


DpHdlDeadWp: W17 (pid=4328) terminated automatically

Mon Sep 16 15:25:00:683 2019


DpWpDynCreate: created new work process W18-6840

Mon Sep 16 15:26:07:856 2019


DpWpDynCreate: created new work process W19-7201

Mon Sep 16 15:26:59:722 2019


DpWpDynCreate: created new work process W7-7450

Mon Sep 16 15:30:02:182 2019


DpHdlDeadWp: W18 (pid=6840) terminated automatically

Mon Sep 16 15:31:23:413 2019


DpWpCheck: dyn W19, pid 7201 no longer needed, terminate now

Mon Sep 16 15:31:23:645 2019


DpHdlDeadWp: W19 (pid=7201) terminated automatically

Mon Sep 16 15:32:03:554 2019


DpHdlDeadWp: W7 (pid=7450) terminated automatically

Mon Sep 16 15:36:00:627 2019


DpWpDynCreate: created new work process W20-10567

Mon Sep 16 15:41:01:972 2019


DpHdlDeadWp: W20 (pid=10567) terminated automatically

Mon Sep 16 15:41:08:770 2019


DpWpDynCreate: created new work process W17-12204

Mon Sep 16 15:46:12:476 2019


DpHdlDeadWp: W17 (pid=12204) terminated automatically

Mon Sep 16 15:51:05:261 2019


DpWpDynCreate: created new work process W18-15441

Mon Sep 16 15:54:01:023 2019


DpWpDynCreate: created new work process W19-16400
DpWpDynCreate: created new work process W7-16406

Mon Sep 16 15:56:07:293 2019


DpHdlDeadWp: W18 (pid=15441) terminated automatically

Mon Sep 16 15:59:02:441 2019


DpHdlDeadWp: W19 (pid=16400) terminated automatically
DpWpDynCreate: created new work process W19-18069

Mon Sep 16 15:59:03:594 2019


DpHdlDeadWp: W7 (pid=16406) terminated automatically

Mon Sep 16 15:59:08:616 2019


DpWpDynCreate: created new work process W20-18079

Mon Sep 16 16:04:03:605 2019


DpWpCheck: dyn W19, pid 18069 no longer needed, terminate now

Mon Sep 16 16:04:04:116 2019


DpHdlDeadWp: W19 (pid=18069) terminated automatically

Mon Sep 16 16:04:23:606 2019


DpWpCheck: dyn W20, pid 18079 no longer needed, terminate now

Mon Sep 16 16:04:25:494 2019


DpHdlDeadWp: W20 (pid=18079) terminated automatically

Mon Sep 16 16:06:01:863 2019


DpWpDynCreate: created new work process W17-4199

Mon Sep 16 16:07:02:756 2019


DpWpDynCreate: created new work process W18-7951

Mon Sep 16 16:11:03:628 2019


DpWpCheck: dyn W17, pid 4199 no longer needed, terminate now

Mon Sep 16 16:11:06:746 2019


DpHdlDeadWp: W17 (pid=4199) terminated automatically

Mon Sep 16 16:12:03:801 2019


DpHdlDeadWp: W18 (pid=7951) terminated automatically

Mon Sep 16 16:14:05:589 2019


DpWpDynCreate: created new work process W7-18864

Mon Sep 16 16:19:23:813 2019


DpWpCheck: dyn W7, pid 18864 no longer needed, terminate now

Mon Sep 16 16:19:24:298 2019


DpHdlDeadWp: W7 (pid=18864) terminated automatically

Mon Sep 16 16:25:00:426 2019


DpWpDynCreate: created new work process W19-22464

Mon Sep 16 16:26:08:036 2019


DpWpDynCreate: created new work process W20-22882

Mon Sep 16 16:26:11:441 2019


DpWpDynCreate: created new work process W17-22908

Mon Sep 16 16:30:02:313 2019


DpHdlDeadWp: W19 (pid=22464) terminated automatically

Mon Sep 16 16:31:09:739 2019


DpHdlDeadWp: W20 (pid=22882) terminated automatically

Mon Sep 16 16:31:13:543 2019


DpHdlDeadWp: W17 (pid=22908) terminated automatically

Mon Sep 16 16:36:18:830 2019


DpWpDynCreate: created new work process W18-26570

Mon Sep 16 16:38:06:110 2019


DpWpDynCreate: created new work process W7-27145

Mon Sep 16 16:40:00:633 2019


DpWpDynCreate: created new work process W19-27572

Mon Sep 16 16:41:19:888 2019


DpHdlDeadWp: W18 (pid=26570) terminated automatically

Mon Sep 16 16:43:08:688 2019


DpHdlDeadWp: W7 (pid=27145) terminated automatically

Mon Sep 16 16:45:03:867 2019


DpWpCheck: dyn W19, pid 27572 no longer needed, terminate now

Mon Sep 16 16:45:04:145 2019


DpHdlDeadWp: W19 (pid=27572) terminated automatically

Mon Sep 16 16:46:13:113 2019


DpWpDynCreate: created new work process W20-29786

Mon Sep 16 16:48:59:885 2019


DpWpDynCreate: created new work process W17-30515

Mon Sep 16 16:51:23:876 2019


DpWpCheck: dyn W20, pid 29786 no longer needed, terminate now

Mon Sep 16 16:51:24:547 2019


DpHdlDeadWp: W20 (pid=29786) terminated automatically

Mon Sep 16 16:54:00:645 2019


DpHdlDeadWp: W17 (pid=30515) terminated automatically

Mon Sep 16 16:56:14:614 2019


DpWpDynCreate: created new work process W18-724

Mon Sep 16 16:56:16:863 2019


DpWpDynCreate: created new work process W7-729

Mon Sep 16 17:01:23:894 2019


DpWpCheck: dyn W7, pid 729 no longer needed, terminate now
DpWpCheck: dyn W18, pid 724 no longer needed, terminate now

Mon Sep 16 17:01:25:090 2019


DpHdlDeadWp: W7 (pid=729) terminated automatically
DpHdlDeadWp: W18 (pid=724) terminated automatically

Mon Sep 16 17:02:14:765 2019


DpWpDynCreate: created new work process W19-5941

Mon Sep 16 17:07:23:904 2019


DpWpCheck: dyn W19, pid 5941 no longer needed, terminate now

Mon Sep 16 17:07:24:490 2019


DpHdlDeadWp: W19 (pid=5941) terminated automatically

Mon Sep 16 17:09:43:711 2019


DpWpDynCreate: created new work process W20-751

Mon Sep 16 17:15:00:104 2019


DpHdlDeadWp: W20 (pid=751) terminated automatically

Mon Sep 16 17:15:59:776 2019


DpWpDynCreate: created new work process W17-3068

Mon Sep 16 17:20:56:124 2019


DpHdlDeadWp: W12 (pid=19470) terminated automatically
DpWpDynCreate: created new work process W12-5079

Mon Sep 16 17:21:01:196 2019


DpHdlDeadWp: W17 (pid=3068) terminated automatically

Mon Sep 16 17:22:34:335 2019


DpHdlDeadWp: W10 (pid=18774) terminated automatically
DpWpDynCreate: created new work process W10-5557

Mon Sep 16 17:39:19:013 2019


DpWpDynCreate: created new work process W7-11363

Mon Sep 16 17:44:20:470 2019


DpHdlDeadWp: W7 (pid=11363) terminated automatically

Mon Sep 16 17:46:01:756 2019


DpWpDynCreate: created new work process W18-13403

Mon Sep 16 17:51:02:797 2019


DpHdlDeadWp: W18 (pid=13403) terminated automatically
DpWpDynCreate: created new work process W18-15039

Mon Sep 16 17:56:04:469 2019


DpHdlDeadWp: W18 (pid=15039) terminated automatically

Mon Sep 16 17:59:16:253 2019


DpWpDynCreate: created new work process W19-18007

Mon Sep 16 18:04:24:484 2019


DpWpCheck: dyn W19, pid 18007 no longer needed, terminate now

Mon Sep 16 18:04:24:999 2019


DpHdlDeadWp: W19 (pid=18007) terminated automatically

Mon Sep 16 18:06:00:812 2019


DpWpDynCreate: created new work process W20-3298

Mon Sep 16 18:11:04:125 2019


DpHdlDeadWp: W20 (pid=3298) terminated automatically

Mon Sep 16 18:15:30:813 2019


DpWpDynCreate: created new work process W17-18995

Mon Sep 16 18:20:44:511 2019


DpWpCheck: dyn W17, pid 18995 no longer needed, terminate now

Mon Sep 16 18:20:44:768 2019


DpHdlDeadWp: W17 (pid=18995) terminated automatically

Mon Sep 16 18:28:00:899 2019


DpWpDynCreate: created new work process W7-23260

Mon Sep 16 18:33:04:531 2019


DpWpCheck: dyn W7, pid 23260 no longer needed, terminate now

Mon Sep 16 18:33:05:433 2019


DpHdlDeadWp: W7 (pid=23260) terminated automatically

Mon Sep 16 18:39:50:325 2019


DpWpDynCreate: created new work process W18-27540

Mon Sep 16 18:44:58:993 2019


DpHdlDeadWp: W18 (pid=27540) terminated automatically

Mon Sep 16 18:45:06:563 2019


DpWpDynCreate: created new work process W19-29112

Mon Sep 16 18:50:08:387 2019


DpHdlDeadWp: W19 (pid=29112) terminated automatically

Mon Sep 16 18:53:01:549 2019


DpWpDynCreate: created new work process W20-31530

Mon Sep 16 18:53:02:393 2019


DpWpDynCreate: created new work process W17-31534

Mon Sep 16 18:58:04:051 2019


DpHdlDeadWp: W17 (pid=31534) terminated automatically

Mon Sep 16 18:58:25:072 2019


DpHdlDeadWp: W20 (pid=31530) terminated automatically

Mon Sep 16 18:59:09:009 2019


DpWpDynCreate: created new work process W7-1469

Mon Sep 16 19:04:15:812 2019


DpHdlDeadWp: W7 (pid=1469) terminated automatically

Mon Sep 16 19:05:13:018 2019


DpWpDynCreate: created new work process W18-4517

Mon Sep 16 19:10:11:096 2019


DpWpDynCreate: created new work process W19-21621

Mon Sep 16 19:10:14:246 2019


DpHdlDeadWp: W18 (pid=4517) terminated automatically

Mon Sep 16 19:15:13:889 2019


DpHdlDeadWp: W19 (pid=21621) terminated automatically

Mon Sep 16 19:18:03:802 2019


DpWpDynCreate: created new work process W17-3690

Mon Sep 16 19:23:05:114 2019


DpWpCheck: dyn W17, pid 3690 no longer needed, terminate now

Mon Sep 16 19:23:05:747 2019


DpHdlDeadWp: W17 (pid=3690) terminated automatically

Mon Sep 16 19:30:18:782 2019


DpWpDynCreate: created new work process W20-7847

Mon Sep 16 19:35:25:136 2019


DpWpCheck: dyn W20, pid 7847 no longer needed, terminate now

Mon Sep 16 19:35:25:409 2019


DpHdlDeadWp: W20 (pid=7847) terminated automatically

Mon Sep 16 19:37:00:729 2019


DpWpDynCreate: created new work process W7-10384

Mon Sep 16 19:41:13:277 2019


DpWpDynCreate: created new work process W18-11839

Mon Sep 16 19:42:03:826 2019


DpHdlDeadWp: W7 (pid=10384) terminated automatically

Mon Sep 16 19:46:14:584 2019


DpHdlDeadWp: W18 (pid=11839) terminated automatically

Mon Sep 16 19:51:02:650 2019


DpWpDynCreate: created new work process W19-15019

Mon Sep 16 19:56:00:413 2019


DpWpDynCreate: created new work process W17-16706

Mon Sep 16 19:56:05:095 2019


DpHdlDeadWp: W19 (pid=15019) terminated automatically

Mon Sep 16 19:58:02:381 2019


DpHdlDeadWp: W11 (pid=6418) terminated automatically
DpWpDynCreate: created new work process W11-17346

Mon Sep 16 20:01:02:794 2019


DpHdlDeadWp: W17 (pid=16706) terminated automatically

Mon Sep 16 20:02:10:861 2019


DpWpDynCreate: created new work process W20-19593

Mon Sep 16 20:05:46:141 2019


DpWpDynCreate: created new work process W7-26041

Mon Sep 16 20:07:12:064 2019


DpHdlDeadWp: W20 (pid=19593) terminated automatically

Mon Sep 16 20:07:14:889 2019


DpWpDynCreate: created new work process W18-29209

Mon Sep 16 20:08:51:724 2019


DpWpDynCreate: created new work process W19-306

Mon Sep 16 20:11:00:300 2019


DpHdlDeadWp: W7 (pid=26041) terminated automatically

Mon Sep 16 20:12:25:195 2019


DpWpCheck: dyn W18, pid 29209 no longer needed, terminate now

Mon Sep 16 20:12:26:349 2019


DpHdlDeadWp: W18 (pid=29209) terminated automatically

Mon Sep 16 20:14:05:197 2019


DpWpCheck: dyn W19, pid 306 no longer needed, terminate now

Mon Sep 16 20:14:05:511 2019


DpHdlDeadWp: W19 (pid=306) terminated automatically

Mon Sep 16 20:21:16:753 2019


DpWpDynCreate: created new work process W17-20778

Mon Sep 16 20:26:25:215 2019


DpWpCheck: dyn W17, pid 20778 no longer needed, terminate now

Mon Sep 16 20:26:27:353 2019


DpHdlDeadWp: W17 (pid=20778) terminated automatically

Mon Sep 16 20:27:12:152 2019


DpWpDynCreate: created new work process W20-22867

Mon Sep 16 20:32:16:604 2019


DpHdlDeadWp: W20 (pid=22867) terminated automatically

Mon Sep 16 20:35:00:643 2019


DpWpDynCreate: created new work process W7-25788

Mon Sep 16 20:40:05:239 2019


DpWpCheck: dyn W7, pid 25788 no longer needed, terminate now

Mon Sep 16 20:40:06:050 2019


DpHdlDeadWp: W7 (pid=25788) terminated automatically

Mon Sep 16 20:42:02:377 2019


DpWpDynCreate: created new work process W18-28019

Mon Sep 16 20:47:03:635 2019


DpHdlDeadWp: W18 (pid=28019) terminated automatically

Mon Sep 16 20:52:23:225 2019


DpWpDynCreate: created new work process W19-31405

Mon Sep 16 20:56:05:915 2019


DpHdlDeadWp: W9 (pid=18255) terminated automatically
DpWpDynCreate: created new work process W9-32653

Mon Sep 16 20:57:25:214 2019


DpHdlDeadWp: W19 (pid=31405) terminated automatically

Mon Sep 16 21:02:16:258 2019


DpWpDynCreate: created new work process W17-2613

Mon Sep 16 21:07:12:428 2019


DpWpDynCreate: created new work process W20-12788

Mon Sep 16 21:07:17:440 2019


DpHdlDeadWp: W17 (pid=2613) terminated automatically

Mon Sep 16 21:12:12:020 2019


DpWpDynCreate: created new work process W7-31154

Mon Sep 16 21:12:25:289 2019


DpWpCheck: dyn W20, pid 12788 no longer needed, terminate now

Mon Sep 16 21:12:25:709 2019


DpHdlDeadWp: W20 (pid=12788) terminated automatically

Mon Sep 16 21:17:14:208 2019


DpHdlDeadWp: W7 (pid=31154) terminated automatically

Mon Sep 16 21:18:01:496 2019


DpWpDynCreate: created new work process W18-3064

Mon Sep 16 21:22:11:392 2019


DpWpDynCreate: created new work process W19-4729

Mon Sep 16 21:23:05:308 2019


DpWpCheck: dyn W18, pid 3064 no longer needed, terminate now

Mon Sep 16 21:23:05:576 2019


DpHdlDeadWp: W18 (pid=3064) terminated automatically

Mon Sep 16 21:27:12:184 2019


DpWpDynCreate: created new work process W17-6376

Mon Sep 16 21:27:25:315 2019


DpWpCheck: dyn W19, pid 4729 no longer needed, terminate now

Mon Sep 16 21:27:25:674 2019


DpHdlDeadWp: W19 (pid=4729) terminated automatically

Mon Sep 16 21:32:16:548 2019


DpWpDynCreate: created new work process W20-8197

Mon Sep 16 21:32:23:485 2019


DpHdlDeadWp: W17 (pid=6376) terminated automatically

Mon Sep 16 21:37:00:927 2019


DpWpDynCreate: created new work process W7-9682

Mon Sep 16 21:37:18:310 2019


DpHdlDeadWp: W20 (pid=8197) terminated automatically

Mon Sep 16 21:42:01:431 2019


DpHdlDeadWp: W7 (pid=9682) terminated automatically

Mon Sep 16 21:43:01:520 2019


DpWpDynCreate: created new work process W18-11717

Mon Sep 16 21:48:04:827 2019


DpHdlDeadWp: W18 (pid=11717) terminated automatically

Mon Sep 16 21:52:23:459 2019


DpWpDynCreate: created new work process W19-14774

Mon Sep 16 21:52:23:565 2019


DpWpDynCreate: created new work process W17-14776

Mon Sep 16 21:57:25:675 2019


DpHdlDeadWp: W17 (pid=14776) terminated automatically
DpWpCheck: dyn W19, pid 14774 no longer needed, terminate now

Mon Sep 16 21:57:26:751 2019


DpHdlDeadWp: W19 (pid=14774) terminated automatically

Mon Sep 16 21:59:04:655 2019


DpWpDynCreate: created new work process W20-16851

Mon Sep 16 22:02:16:132 2019


DpWpDynCreate: created new work process W7-20745

Mon Sep 16 22:04:00:582 2019


DpWpDynCreate: created new work process W18-27273

Mon Sep 16 22:04:06:006 2019


DpHdlDeadWp: W20 (pid=16851) terminated automatically

Mon Sep 16 22:07:18:795 2019


DpHdlDeadWp: W7 (pid=20745) terminated automatically

Mon Sep 16 22:09:02:436 2019


DpHdlDeadWp: W18 (pid=27273) terminated automatically

Mon Sep 16 22:09:03:940 2019


DpWpDynCreate: created new work process W17-15388

Mon Sep 16 22:14:05:052 2019


DpHdlDeadWp: W17 (pid=15388) terminated automatically

Mon Sep 16 22:18:02:801 2019


DpWpDynCreate: created new work process W19-18454

Mon Sep 16 22:23:04:594 2019


DpHdlDeadWp: W19 (pid=18454) terminated automatically

Mon Sep 16 22:37:18:317 2019


DpWpDynCreate: created new work process W20-25412

Mon Sep 16 22:42:20:580 2019


DpHdlDeadWp: W20 (pid=25412) terminated automatically

Mon Sep 16 22:46:01:674 2019


DpWpDynCreate: created new work process W7-28099

Mon Sep 16 22:46:01:811 2019


DpWpDynCreate: created new work process W18-28104

Mon Sep 16 22:51:03:143 2019


DpHdlDeadWp: W18 (pid=28104) terminated automatically

Mon Sep 16 22:51:35:524 2019


DpHdlDeadWp: W7 (pid=28099) terminated automatically

Mon Sep 16 22:52:02:892 2019


DpWpDynCreate: created new work process W17-30114

Mon Sep 16 22:57:04:849 2019


DpHdlDeadWp: W17 (pid=30114) terminated automatically

Mon Sep 16 22:59:10:327 2019


DpWpDynCreate: created new work process W19-32615

Mon Sep 16 23:04:11:902 2019


DpHdlDeadWp: W19 (pid=32615) terminated automatically

Mon Sep 16 23:07:11:619 2019


DpWpDynCreate: created new work process W20-19295

Mon Sep 16 23:12:26:130 2019


DpWpCheck: dyn W20, pid 19295 no longer needed, terminate now

Mon Sep 16 23:12:26:744 2019


DpHdlDeadWp: W20 (pid=19295) terminated automatically

Mon Sep 16 23:22:11:393 2019


DpWpDynCreate: created new work process W18-3553

Mon Sep 16 23:27:15:859 2019


DpHdlDeadWp: W18 (pid=3553) terminated automatically

Mon Sep 16 23:28:00:014 2019


DpWpDynCreate: created new work process W7-5535

Mon Sep 16 23:28:00:223 2019


DpWpDynCreate: created new work process W17-5547

Mon Sep 16 23:33:01:951 2019


DpHdlDeadWp: W7 (pid=5535) terminated automatically

Mon Sep 16 23:33:02:789 2019


DpHdlDeadWp: W17 (pid=5547) terminated automatically

Mon Sep 16 23:34:05:381 2019


DpWpDynCreate: created new work process W19-7525

Mon Sep 16 23:36:07:944 2019


DpWpDynCreate: created new work process W20-8312

Mon Sep 16 23:39:06:178 2019


DpWpCheck: dyn W19, pid 7525 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=7525) terminated automatically

Mon Sep 16 23:41:11:599 2019


DpHdlDeadWp: W20 (pid=8312) terminated automatically

Mon Sep 16 23:42:08:775 2019


DpWpDynCreate: created new work process W18-10436

Mon Sep 16 23:47:09:129 2019


DpHdlDeadWp: W18 (pid=10436) terminated automatically

Mon Sep 16 23:47:09:258 2019


DpWpDynCreate: created new work process W7-12012

Mon Sep 16 23:48:02:908 2019


DpWpDynCreate: created new work process W17-12230

Mon Sep 16 23:52:26:201 2019


DpWpCheck: dyn W7, pid 12012 no longer needed, terminate now

Mon Sep 16 23:52:27:224 2019


DpHdlDeadWp: W7 (pid=12012) terminated automatically

Mon Sep 16 23:53:05:358 2019


DpHdlDeadWp: W17 (pid=12230) terminated automatically

Mon Sep 16 23:57:12:691 2019


DpWpDynCreate: created new work process W19-15287

Tue Sep 17 00:00:04:985 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 9->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5
DpWpConf: wp reconfiguration, stop W19, pid 15287
DpAdaptWppriv_max_no : 4 -> 4

Tue Sep 17 00:00:05:083 2019


DpHdlDeadWp: W19 (pid=15287) terminated automatically

Tue Sep 17 00:02:03:313 2019


DpWpDynCreate: created new work process W20-16923

Tue Sep 17 00:07:06:226 2019


DpWpCheck: dyn W20, pid 16923 no longer needed, terminate now

Tue Sep 17 00:07:08:414 2019


DpHdlDeadWp: W20 (pid=16923) terminated automatically

Tue Sep 17 00:22:12:627 2019


DpWpDynCreate: created new work process W18-26774

Tue Sep 17 00:22:14:381 2019


DpWpDynCreate: created new work process W7-26849

Tue Sep 17 00:27:13:232 2019


DpHdlDeadWp: W18 (pid=26774) terminated automatically

Tue Sep 17 00:27:26:258 2019


DpWpCheck: dyn W7, pid 26849 no longer needed, terminate now

Tue Sep 17 00:27:28:418 2019


DpHdlDeadWp: W7 (pid=26849) terminated automatically

Tue Sep 17 00:29:00:762 2019


DpWpDynCreate: created new work process W17-5760

Tue Sep 17 00:33:19:869 2019


DpWpDynCreate: created new work process W19-13228

Tue Sep 17 00:34:06:272 2019


DpWpCheck: dyn W17, pid 5760 no longer needed, terminate now

Tue Sep 17 00:34:06:870 2019


DpHdlDeadWp: W17 (pid=5760) terminated automatically

Tue Sep 17 00:38:26:282 2019


DpWpCheck: dyn W19, pid 13228 no longer needed, terminate now

Tue Sep 17 00:38:27:241 2019


DpHdlDeadWp: W19 (pid=13228) terminated automatically

Tue Sep 17 00:42:48:479 2019


DpWpDynCreate: created new work process W20-16943

Tue Sep 17 00:48:05:351 2019


DpHdlDeadWp: W20 (pid=16943) terminated automatically

Tue Sep 17 00:55:03:487 2019


DpWpDynCreate: created new work process W18-22964

Tue Sep 17 01:00:05:127 2019


DpHdlDeadWp: W18 (pid=22964) terminated automatically

Tue Sep 17 01:02:03:432 2019


DpWpDynCreate: created new work process W7-6334

Tue Sep 17 01:07:04:723 2019


DpHdlDeadWp: W7 (pid=6334) terminated automatically

Tue Sep 17 01:07:18:834 2019


DpWpDynCreate: created new work process W17-21060

Tue Sep 17 01:12:19:738 2019


DpHdlDeadWp: W17 (pid=21060) terminated automatically

Tue Sep 17 01:14:00:044 2019


DpWpDynCreate: created new work process W19-4298

Tue Sep 17 01:19:01:803 2019


DpHdlDeadWp: W19 (pid=4298) terminated automatically

Tue Sep 17 01:19:41:793 2019


DpWpDynCreate: created new work process W20-6571

Tue Sep 17 01:19:45:872 2019


DpWpDynCreate: created new work process W18-6578

Tue Sep 17 01:22:01:016 2019


DpWpDynCreate: created new work process W7-7323

Tue Sep 17 01:24:46:360 2019


DpWpCheck: dyn W18, pid 6578 no longer needed, terminate now
DpWpCheck: dyn W20, pid 6571 no longer needed, terminate now

Tue Sep 17 01:24:47:157 2019


DpHdlDeadWp: W18 (pid=6578) terminated automatically
DpHdlDeadWp: W20 (pid=6571) terminated automatically

Tue Sep 17 01:25:06:989 2019


DpWpDynCreate: created new work process W17-8546

Tue Sep 17 01:27:06:364 2019


DpWpCheck: dyn W7, pid 7323 no longer needed, terminate now

Tue Sep 17 01:27:07:348 2019


DpHdlDeadWp: W7 (pid=7323) terminated automatically

Tue Sep 17 01:28:06:064 2019


DpHdlDeadWp: W12 (pid=5079) terminated automatically
DpWpDynCreate: created new work process W12-9522

Tue Sep 17 01:29:00:764 2019


DpWpDynCreate: created new work process W19-9855

Tue Sep 17 01:30:26:369 2019


DpWpCheck: dyn W17, pid 8546 no longer needed, terminate now

Tue Sep 17 01:30:26:510 2019


DpHdlDeadWp: W17 (pid=8546) terminated automatically

Tue Sep 17 01:34:06:374 2019


DpWpCheck: dyn W19, pid 9855 no longer needed, terminate now

Tue Sep 17 01:34:06:748 2019


DpHdlDeadWp: W19 (pid=9855) terminated automatically

Tue Sep 17 01:37:01:016 2019


DpWpDynCreate: created new work process W18-12409

Tue Sep 17 01:40:46:074 2019


DpHdlDeadWp: W11 (pid=17346) terminated automatically
DpWpDynCreate: created new work process W11-13745

Tue Sep 17 01:42:06:388 2019


DpWpCheck: dyn W18, pid 12409 no longer needed, terminate now

Tue Sep 17 01:42:07:258 2019


DpHdlDeadWp: W18 (pid=12409) terminated automatically

Tue Sep 17 01:42:11:706 2019


DpWpDynCreate: created new work process W20-14238

Tue Sep 17 01:47:26:400 2019


DpWpCheck: dyn W20, pid 14238 no longer needed, terminate now

Tue Sep 17 01:47:26:582 2019


DpHdlDeadWp: W20 (pid=14238) terminated automatically

Tue Sep 17 01:49:07:245 2019


DpWpDynCreate: created new work process W7-16647

Tue Sep 17 01:51:38:621 2019


DpWpDynCreate: created new work process W17-17453

Tue Sep 17 01:54:26:411 2019


DpWpCheck: dyn W7, pid 16647 no longer needed, terminate now

Tue Sep 17 01:54:27:029 2019


DpHdlDeadWp: W7 (pid=16647) terminated automatically

Tue Sep 17 01:56:46:414 2019


DpWpCheck: dyn W17, pid 17453 no longer needed, terminate now

Tue Sep 17 01:56:47:243 2019


DpHdlDeadWp: W17 (pid=17453) terminated automatically

Tue Sep 17 02:02:23:519 2019


DpWpDynCreate: created new work process W19-23837

Tue Sep 17 02:07:27:898 2019


DpHdlDeadWp: W19 (pid=23837) terminated automatically

Tue Sep 17 02:11:03:550 2019


DpWpDynCreate: created new work process W18-15145

Tue Sep 17 02:12:05:099 2019


DpWpDynCreate: created new work process W20-18925

Tue Sep 17 02:16:04:678 2019


DpHdlDeadWp: W18 (pid=15145) terminated automatically

Tue Sep 17 02:17:06:458 2019


DpWpCheck: dyn W20, pid 18925 no longer needed, terminate now

Tue Sep 17 02:17:06:693 2019


DpHdlDeadWp: W20 (pid=18925) terminated automatically

Tue Sep 17 02:18:07:951 2019


DpWpDynCreate: created new work process W7-22041

Tue Sep 17 02:23:26:469 2019


DpWpCheck: dyn W7, pid 22041 no longer needed, terminate now

Tue Sep 17 02:23:26:578 2019


DpHdlDeadWp: W7 (pid=22041) terminated automatically

Tue Sep 17 02:24:04:009 2019


DpWpDynCreate: created new work process W17-23992

Tue Sep 17 02:24:41:995 2019


DpWpDynCreate: created new work process W19-24386

Tue Sep 17 02:24:42:014 2019


DpWpDynCreate: created new work process W18-24387

Tue Sep 17 02:24:42:337 2019


DpWpDynCreate: created new work process W20-24389

Tue Sep 17 02:29:06:479 2019


DpWpCheck: dyn W17, pid 23992 no longer needed, terminate now

Tue Sep 17 02:29:06:811 2019


DpHdlDeadWp: W17 (pid=23992) terminated automatically

Tue Sep 17 02:29:46:510 2019


DpHdlDeadWp: W18 (pid=24387) terminated automatically
DpWpCheck: dyn W19, pid 24386 no longer needed, terminate now
DpWpCheck: dyn W20, pid 24389 no longer needed, terminate now

Tue Sep 17 02:29:47:589 2019


DpHdlDeadWp: W19 (pid=24386) terminated automatically
DpHdlDeadWp: W20 (pid=24389) terminated automatically

Tue Sep 17 02:43:01:169 2019


DpWpDynCreate: created new work process W7-30160

Tue Sep 17 02:48:01:112 2019


DpWpDynCreate: created new work process W17-32068

Tue Sep 17 02:48:03:172 2019


DpHdlDeadWp: W7 (pid=30160) terminated automatically

Tue Sep 17 02:53:05:505 2019


DpHdlDeadWp: W17 (pid=32068) terminated automatically

Tue Sep 17 02:57:03:298 2019


DpHdlSoftCancel: cancel request for T96_U809_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T26_U808_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:08:301 2019


DpHdlSoftCancel: cancel request for T105_U810_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T86_U811_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:13:306 2019


DpHdlSoftCancel: cancel request for T7_U814_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T117_U815_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:23:313 2019


DpHdlSoftCancel: cancel request for T13_U817_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:28:313 2019


DpHdlSoftCancel: cancel request for T2_U819_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T98_U821_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T43_U822_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:33:314 2019


DpHdlSoftCancel: cancel request for T97_U823_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:52:701 2019


DpHdlSoftCancel: cancel request for T74_U845_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T67_U843_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T27_U844_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T112_U842_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T6_U838_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T119_U839_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T57_U841_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T8_U836_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T153_U840_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:57:57:706 2019


DpHdlSoftCancel: cancel request for T50_U847_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T107_U849_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:58:18:716 2019


DpHdlSoftCancel: cancel request for T126_U895_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T4_U893_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:58:53:287 2019


DpHdlSoftCancel: cancel request for T69_U865_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:59:08:292 2019


DpHdlSoftCancel: cancel request for T81_U871_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:59:13:296 2019


DpHdlSoftCancel: cancel request for T52_U874_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T83_U875_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:59:28:306 2019


DpHdlSoftCancel: cancel request for T120_U878_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:59:52:381 2019


DpHdlSoftCancel: cancel request for T94_U882_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 02:59:57:386 2019


DpHdlSoftCancel: cancel request for T9_U885_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:00:454 2019


DpSendLoadInfo: quota for load / queue fill level = 7.200000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 7.203686 / 1.350000

Tue Sep 17 03:00:03:389 2019


DpHdlSoftCancel: cancel request for T142_U892_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T128_U894_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:08:393 2019


DpHdlSoftCancel: cancel request for T84_U896_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:23:397 2019


DpHdlSoftCancel: cancel request for T1_U903_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:28:400 2019


DpHdlSoftCancel: cancel request for T64_U906_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T149_U905_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:33:402 2019


DpHdlSoftCancel: cancel request for T71_U907_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:38:407 2019


DpHdlSoftCancel: cancel request for T58_U825_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:48:410 2019


DpHdlSoftCancel: cancel request for T95_U916_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T89_U911_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T99_U915_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T88_U909_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T15_U913_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:53:415 2019


DpHdlSoftCancel: cancel request for T91_U917_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T31_U918_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T77_U919_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:00:58:416 2019


DpHdlSoftCancel: cancel request for T23_U920_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:01:06:566 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Request handling without progress

Tue Sep 17 03:01:06:815 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (1. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:07:816 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (2. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:08:421 2019


DpHdlSoftCancel: cancel request for T141_U926_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:01:08:817 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (3. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:09:818 2019
*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (4. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:10:818 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (5. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:11:820 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (6. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:12:821 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (7. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:13:425 2019


DpHdlSoftCancel: cancel request for T53_U929_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:01:13:821 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (8. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:14:822 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (9. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:15:823 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (10. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:16:824 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (11. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:17:825 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (12. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:18:826 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (13. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:19:827 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (14. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:20:828 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (15. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:21:829 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (16. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:22:830 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (17. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:23:832 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (18. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:24:833 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (19. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:25:833 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (20. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:26:834 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (21. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:27:835 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (22. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:28:836 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (23. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:29:837 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (24. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:30:838 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (25. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:31:838 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (26. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:32:839 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (27. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:33:840 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (28. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:34:840 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (29. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:35:841 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (30. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:36:842 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (31. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:37:843 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (32. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:38:844 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (33. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:39:845 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Request handling without progress
*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (34. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:40:846 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (35. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:41:847 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (36. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:42:847 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (37. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:43:847 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (38. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:44:848 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (39. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:45:848 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (40. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:46:849 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (41. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:47:850 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (42. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:48:851 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (43. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:49:852 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (44. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:50:853 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (45. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:51:854 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (46. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:52:856 2019
*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (47. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:53:857 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (48. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:54:183 2019


DpHdlSoftCancel: cancel request for T73_U941_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:01:54:857 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (49. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:55:858 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (50. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:56:859 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (51. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:57:860 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (52. check) [dpxxwp.c 4705]

Tue Sep 17 03:01:58:860 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (53. check) [dpxxwp.c 4705]

Tue Sep 17 03:02:00:057 2019


DpHdlSoftCancel: cancel request for T101_U947_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Tue Sep 17 03:02:02:611 2019


DpWpDynCreate: created new work process W18-6667

Tue Sep 17 03:02:02:921 2019


DpSendLoadInfo: queue DIA no longer with high load

Tue Sep 17 03:02:03:791 2019


DpWpDynCreate: created new work process W19-6672

Tue Sep 17 03:04:18:924 2019


DpWpDynCreate: created new work process W20-12535

Tue Sep 17 03:07:04:960 2019


DpWpCheck: dyn W18, pid 6667 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=6672) terminated automatically

Tue Sep 17 03:07:05:215 2019


DpHdlDeadWp: W18 (pid=6667) terminated automatically

Tue Sep 17 03:08:19:980 2019


DpWpDynCreate: created new work process W7-23743

Tue Sep 17 03:08:23:803 2019
DpWpDynCreate: created new work process W17-24056

Tue Sep 17 03:09:26:579 2019


DpWpCheck: dyn W20, pid 12535 no longer needed, terminate now

Tue Sep 17 03:09:27:330 2019


DpHdlDeadWp: W20 (pid=12535) terminated automatically

Tue Sep 17 03:13:26:587 2019


DpWpCheck: dyn W7, pid 23743 no longer needed, terminate now
DpWpCheck: dyn W17, pid 24056 no longer needed, terminate now

Tue Sep 17 03:13:27:642 2019


DpHdlDeadWp: W7 (pid=23743) terminated automatically
DpHdlDeadWp: W17 (pid=24056) terminated automatically

Tue Sep 17 03:20:05:484 2019


DpWpDynCreate: created new work process W19-6132

Tue Sep 17 03:20:17:493 2019


DpWpDynCreate: created new work process W18-6223

Tue Sep 17 03:23:22:377 2019


DpWpDynCreate: created new work process W20-7338

Tue Sep 17 03:25:14:065 2019


DpHdlDeadWp: W19 (pid=6132) terminated automatically

Tue Sep 17 03:25:26:608 2019


DpWpCheck: dyn W18, pid 6223 no longer needed, terminate now

Tue Sep 17 03:25:27:086 2019


DpHdlDeadWp: W18 (pid=6223) terminated automatically

Tue Sep 17 03:28:26:640 2019


DpHdlDeadWp: W20 (pid=7338) terminated automatically

Tue Sep 17 03:42:00:661 2019


DpWpDynCreate: created new work process W7-13584

Tue Sep 17 03:47:01:840 2019


DpHdlDeadWp: W7 (pid=13584) terminated automatically

Tue Sep 17 03:51:08:599 2019


DpWpDynCreate: created new work process W17-16713

Tue Sep 17 03:53:17:653 2019


DpWpDynCreate: created new work process W19-17566

Tue Sep 17 03:56:13:276 2019


DpHdlDeadWp: W17 (pid=16713) terminated automatically

Tue Sep 17 03:58:18:157 2019


DpHdlDeadWp: W19 (pid=17566) terminated automatically

Tue Sep 17 04:10:03:789 2019


DpWpDynCreate: created new work process W18-18556

Tue Sep 17 04:15:06:723 2019
DpWpCheck: dyn W18, pid 18556 no longer needed, terminate now

Tue Sep 17 04:15:07:280 2019


DpHdlDeadWp: W18 (pid=18556) terminated automatically

Tue Sep 17 04:20:04:975 2019


DpWpDynCreate: created new work process W20-22399

Tue Sep 17 04:25:06:740 2019


DpWpCheck: dyn W20, pid 22399 no longer needed, terminate now

Tue Sep 17 04:25:07:836 2019


DpHdlDeadWp: W20 (pid=22399) terminated automatically

Tue Sep 17 04:28:08:604 2019


DpWpDynCreate: created new work process W7-25248

Tue Sep 17 04:28:11:280 2019


DpWpDynCreate: created new work process W17-25251

Tue Sep 17 04:33:09:403 2019


DpHdlDeadWp: W7 (pid=25248) terminated automatically

Tue Sep 17 04:33:17:418 2019


DpHdlDeadWp: W17 (pid=25251) terminated automatically

Tue Sep 17 04:37:32:002 2019


DpWpDynCreate: created new work process W19-28303

Tue Sep 17 04:37:32:745 2019


DpWpDynCreate: created new work process W18-28309

Tue Sep 17 04:42:46:770 2019


DpWpCheck: dyn W18, pid 28309 no longer needed, terminate now
DpWpCheck: dyn W19, pid 28303 no longer needed, terminate now

Tue Sep 17 04:42:47:836 2019


DpHdlDeadWp: W18 (pid=28309) terminated automatically
DpHdlDeadWp: W19 (pid=28303) terminated automatically

Tue Sep 17 04:44:16:919 2019


DpWpDynCreate: created new work process W20-30472

Tue Sep 17 04:49:17:226 2019


DpHdlDeadWp: W20 (pid=30472) terminated automatically

Tue Sep 17 05:11:18:440 2019


DpWpDynCreate: created new work process W7-2675

Tue Sep 17 05:16:28:102 2019


DpHdlDeadWp: W7 (pid=2675) terminated automatically

Tue Sep 17 05:26:43:744 2019


***LOG Q0I=> NiIRead: P=198.108.67.48:32076; L=10.54.36.29:3200: recv (104: Connection reset by peer) [/bas/749_REL/src/base/ni/nixxi.cpp 5420]
*** ERROR => NiIRead: SiRecv failed for hdl 49/sock 14 (SI_ECONN_BROKEN/104; I4; ST; P=198.108.67.48:32076; L=10.54.36.29:3200) [nixxi.cpp 5420]

Tue Sep 17 05:26:43:891 2019
*** ERROR => NiIRead: invalid data (0x47455420/0x8800;mode=0;hdl 50;peer=198.108.67.48:59856;local=3200) [nixxi.cpp 5226]

Tue Sep 17 05:26:44:171 2019


*** ERROR => NiIRead: invalid data (0x16030100/0x8800;mode=0;hdl 51;peer=198.108.67.48:54064;local=3200) [nixxi.cpp 5226]

Tue Sep 17 05:28:00:778 2019


DpWpDynCreate: created new work process W17-8594

Tue Sep 17 05:33:01:919 2019


DpHdlDeadWp: W17 (pid=8594) terminated automatically

Tue Sep 17 05:43:00:808 2019


DpWpDynCreate: created new work process W18-13820

Tue Sep 17 05:48:05:453 2019


DpHdlDeadWp: W18 (pid=13820) terminated automatically

Tue Sep 17 05:51:11:911 2019


DpWpDynCreate: created new work process W19-16514

Tue Sep 17 05:55:23:959 2019


DpHdlDeadWp: W12 (pid=9522) terminated automatically
DpWpDynCreate: created new work process W12-17824

Tue Sep 17 05:56:26:904 2019


DpWpCheck: dyn W19, pid 16514 no longer needed, terminate now

Tue Sep 17 05:56:27:183 2019


DpHdlDeadWp: W19 (pid=16514) terminated automatically

Tue Sep 17 05:59:05:324 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Length of high priority queue exceeds limit

Tue Sep 17 06:00:01:317 2019


DpWpDynCreate: created new work process W20-19311

Tue Sep 17 06:03:04:626 2019


DpHdlDeadWp: W9 (pid=32653) terminated automatically
DpWpDynCreate: created new work process W9-25625

Tue Sep 17 06:05:04:738 2019


DpHdlDeadWp: W20 (pid=19311) terminated automatically

Tue Sep 17 06:10:40:880 2019


DpWpDynCreate: created new work process W7-18814

Tue Sep 17 06:15:46:944 2019


DpWpCheck: dyn W7, pid 18814 no longer needed, terminate now

Tue Sep 17 06:15:47:658 2019


DpHdlDeadWp: W7 (pid=18814) terminated automatically

Tue Sep 17 06:16:05:765 2019


DpWpDynCreate: created new work process W17-20796

Tue Sep 17 06:21:06:908 2019
DpHdlDeadWp: W17 (pid=20796) terminated automatically

Tue Sep 17 06:24:23:424 2019


DpWpDynCreate: created new work process W18-23536

Tue Sep 17 06:29:26:968 2019


DpWpCheck: dyn W18, pid 23536 no longer needed, terminate now

Tue Sep 17 06:29:28:579 2019


DpHdlDeadWp: W18 (pid=23536) terminated automatically

Tue Sep 17 06:31:04:327 2019


DpWpDynCreate: created new work process W19-26009

Tue Sep 17 06:31:05:348 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Length of high priority queue exceeds limit

Tue Sep 17 06:36:05:921 2019


DpHdlDeadWp: W19 (pid=26009) terminated automatically

Tue Sep 17 06:38:00:768 2019


DpWpDynCreate: created new work process W20-28285

Tue Sep 17 06:43:01:364 2019


DpHdlDeadWp: W20 (pid=28285) terminated automatically

Tue Sep 17 06:49:02:163 2019


DpWpDynCreate: created new work process W7-32008

Tue Sep 17 06:54:07:005 2019


DpWpCheck: dyn W7, pid 32008 no longer needed, terminate now

Tue Sep 17 06:54:07:290 2019


DpHdlDeadWp: W7 (pid=32008) terminated automatically

Tue Sep 17 06:56:00:968 2019


DpWpDynCreate: created new work process W17-2035

Tue Sep 17 07:01:07:016 2019


DpWpCheck: dyn W17, pid 2035 no longer needed, terminate now

Tue Sep 17 07:01:08:819 2019


DpHdlDeadWp: W17 (pid=2035) terminated automatically

Tue Sep 17 07:04:35:263 2019


DpWpDynCreate: created new work process W18-16868

Tue Sep 17 07:09:47:032 2019


DpWpCheck: dyn W18, pid 16868 no longer needed, terminate now

Tue Sep 17 07:09:47:346 2019


DpHdlDeadWp: W18 (pid=16868) terminated automatically

Tue Sep 17 07:10:08:713 2019


DpWpDynCreate: created new work process W19-2244

Tue Sep 17 07:10:36:659 2019
DpWpDynCreate: created new work process W20-2531

Tue Sep 17 07:15:10:068 2019


DpHdlDeadWp: W19 (pid=2244) terminated automatically

Tue Sep 17 07:15:47:042 2019


DpWpCheck: dyn W20, pid 2531 no longer needed, terminate now

Tue Sep 17 07:15:47:507 2019


DpHdlDeadWp: W20 (pid=2531) terminated automatically

Tue Sep 17 07:18:00:425 2019


DpWpDynCreate: created new work process W7-5194

Tue Sep 17 07:23:01:987 2019


DpHdlDeadWp: W7 (pid=5194) terminated automatically

Tue Sep 17 07:30:16:575 2019


DpWpDynCreate: created new work process W17-9506

Tue Sep 17 07:35:17:138 2019


DpHdlDeadWp: W17 (pid=9506) terminated automatically
DpWpDynCreate: created new work process W18-11314

Tue Sep 17 07:40:18:770 2019


DpHdlDeadWp: W18 (pid=11314) terminated automatically

Tue Sep 17 07:49:41:038 2019


DpWpDynCreate: created new work process W19-16103

Tue Sep 17 07:52:12:972 2019


DpWpDynCreate: created new work process W20-16901

Tue Sep 17 07:54:43:653 2019


DpHdlDeadWp: W19 (pid=16103) terminated automatically

Tue Sep 17 07:55:00:973 2019


DpWpDynCreate: created new work process W7-17678

Tue Sep 17 07:56:16:045 2019


DpHdlDeadWp: W11 (pid=13745) terminated automatically
DpWpDynCreate: created new work process W11-18210

Tue Sep 17 07:57:27:110 2019


DpWpCheck: dyn W20, pid 16901 no longer needed, terminate now

Tue Sep 17 07:57:27:932 2019


DpHdlDeadWp: W20 (pid=16901) terminated automatically

Tue Sep 17 08:00:01:693 2019


DpHdlDeadWp: W7 (pid=17678) terminated automatically

Tue Sep 17 08:00:09:591 2019


DpWpDynCreate: created new work process W17-19565

Tue Sep 17 08:00:15:686 2019


DpWpDynCreate: created new work process W18-19600

Tue Sep 17 08:05:21:133 2019
DpWpCheck: dyn W17, pid 19565 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=19600) terminated automatically
DpHdlDeadWp: W17 (pid=19565) terminated automatically

Tue Sep 17 08:11:47:205 2019


DpWpDynCreate: created new work process W19-19084

Tue Sep 17 08:15:01:544 2019


DpWpDynCreate: created new work process W20-20083

Tue Sep 17 08:16:49:450 2019


DpHdlDeadWp: W19 (pid=19084) terminated automatically

Tue Sep 17 08:20:05:544 2019


DpHdlDeadWp: W20 (pid=20083) terminated automatically

Tue Sep 17 08:30:13:260 2019


DpWpDynCreate: created new work process W7-25889

Tue Sep 17 08:35:15:001 2019


DpHdlDeadWp: W7 (pid=25889) terminated automatically

Tue Sep 17 08:38:00:575 2019


DpWpDynCreate: created new work process W18-28352
DpWpDynCreate: created new work process W17-28353

Tue Sep 17 08:43:01:412 2019


DpHdlDeadWp: W17 (pid=28353) terminated automatically
DpHdlDeadWp: W18 (pid=28352) terminated automatically

Tue Sep 17 08:46:02:675 2019


DpWpDynCreate: created new work process W19-31137

Tue Sep 17 08:50:12:433 2019


DpWpDynCreate: created new work process W20-32455

Tue Sep 17 08:51:03:194 2019


DpHdlDeadWp: W19 (pid=31137) terminated automatically

Tue Sep 17 08:55:13:697 2019


DpHdlDeadWp: W20 (pid=32455) terminated automatically

Tue Sep 17 08:57:02:545 2019


DpWpDynCreate: created new work process W7-2663

Tue Sep 17 09:00:14:535 2019


DpWpDynCreate: created new work process W17-3735

Tue Sep 17 09:02:04:830 2019


DpHdlDeadWp: W7 (pid=2663) terminated automatically

Tue Sep 17 09:05:15:145 2019


DpHdlDeadWp: W17 (pid=3735) terminated automatically

Tue Sep 17 09:09:01:953 2019


DpWpDynCreate: created new work process W18-2025

Tue Sep 17 09:09:02:339 2019


DpWpDynCreate: created new work process W19-2033

Tue Sep 17 09:14:03:482 2019


DpHdlDeadWp: W18 (pid=2025) terminated automatically

Tue Sep 17 09:14:03:876 2019


DpHdlDeadWp: W19 (pid=2033) terminated automatically
DpWpDynCreate: created new work process W19-3837

Tue Sep 17 09:19:07:244 2019


DpWpCheck: dyn W19, pid 3837 no longer needed, terminate now

Tue Sep 17 09:19:07:811 2019


DpHdlDeadWp: W19 (pid=3837) terminated automatically

Tue Sep 17 09:21:01:497 2019


DpWpDynCreate: created new work process W20-6667

Tue Sep 17 09:26:02:370 2019


DpHdlDeadWp: W20 (pid=6667) terminated automatically

Tue Sep 17 09:31:18:971 2019


DpWpDynCreate: created new work process W7-10285

Tue Sep 17 09:36:20:844 2019


DpHdlDeadWp: W7 (pid=10285) terminated automatically

Tue Sep 17 09:37:10:170 2019


DpWpDynCreate: created new work process W17-12443

Tue Sep 17 09:42:12:737 2019


DpHdlDeadWp: W17 (pid=12443) terminated automatically

Tue Sep 17 09:56:19:510 2019


DpWpDynCreate: created new work process W18-18713

Tue Sep 17 10:01:27:314 2019


DpWpCheck: dyn W18, pid 18713 no longer needed, terminate now

Tue Sep 17 10:01:27:751 2019


DpHdlDeadWp: W18 (pid=18713) terminated automatically

Tue Sep 17 10:06:46:317 2019


DpWpDynCreate: created new work process W19-7516

Tue Sep 17 10:11:47:339 2019


DpWpCheck: dyn W19, pid 7516 no longer needed, terminate now

Tue Sep 17 10:11:48:384 2019


DpHdlDeadWp: W19 (pid=7516) terminated automatically

Tue Sep 17 10:13:05:678 2019


DpWpDynCreate: created new work process W20-20012

Tue Sep 17 10:18:07:630 2019


DpHdlDeadWp: W20 (pid=20012) terminated automatically

Tue Sep 17 10:22:08:122 2019


DpWpDynCreate: created new work process W7-23305

Tue Sep 17 10:27:10:465 2019
DpHdlDeadWp: W7 (pid=23305) terminated automatically

Tue Sep 17 10:42:14:003 2019


DpWpDynCreate: created new work process W17-30229

Tue Sep 17 10:47:27:681 2019


DpWpCheck: dyn W17, pid 30229 no longer needed, terminate now

Tue Sep 17 10:47:28:235 2019


DpHdlDeadWp: W17 (pid=30229) terminated automatically

Tue Sep 17 10:49:00:637 2019


DpWpDynCreate: created new work process W18-32273

Tue Sep 17 10:54:02:600 2019


DpHdlDeadWp: W18 (pid=32273) terminated automatically

Tue Sep 17 10:54:05:659 2019


DpWpDynCreate: created new work process W19-1631

Tue Sep 17 10:59:06:832 2019


DpHdlDeadWp: W19 (pid=1631) terminated automatically

Tue Sep 17 11:01:49:505 2019


DpWpDynCreate: created new work process W20-6687

Tue Sep 17 11:06:50:696 2019


DpHdlDeadWp: W20 (pid=6687) terminated automatically

Tue Sep 17 11:06:57:547 2019


DpWpDynCreate: created new work process W7-23067

Tue Sep 17 11:12:00:575 2019


DpHdlDeadWp: W7 (pid=23067) terminated automatically

Tue Sep 17 11:12:10:382 2019


DpWpDynCreate: created new work process W17-3601

Tue Sep 17 11:17:15:027 2019


DpHdlDeadWp: W17 (pid=3601) terminated automatically

Tue Sep 17 11:19:07:674 2019


DpHdlDeadWp: W12 (pid=17824) terminated automatically
DpWpDynCreate: created new work process W12-6480

Tue Sep 17 11:36:00:717 2019


DpWpDynCreate: created new work process W18-12026

Tue Sep 17 11:41:07:771 2019


DpWpCheck: dyn W18, pid 12026 no longer needed, terminate now

Tue Sep 17 11:41:07:976 2019


DpHdlDeadWp: W18 (pid=12026) terminated automatically

Tue Sep 17 11:43:00:574 2019


DpWpDynCreate: created new work process W19-14267

Tue Sep 17 11:48:06:393 2019
DpHdlDeadWp: W19 (pid=14267) terminated automatically

Tue Sep 17 11:53:10:727 2019


DpWpDynCreate: created new work process W20-17493

Tue Sep 17 11:58:16:023 2019


DpHdlDeadWp: W20 (pid=17493) terminated automatically

Tue Sep 17 12:04:05:686 2019


DpWpDynCreate: created new work process W7-26478

Tue Sep 17 12:09:06:438 2019


DpHdlDeadWp: W7 (pid=26478) terminated automatically

Tue Sep 17 12:09:48:179 2019


DpWpDynCreate: created new work process W17-12321

Tue Sep 17 12:15:00:796 2019


DpHdlDeadWp: W17 (pid=12321) terminated automatically

Tue Sep 17 12:16:11:279 2019


DpWpDynCreate: created new work process W18-20942

Tue Sep 17 12:21:27:838 2019


DpWpCheck: dyn W18, pid 20942 no longer needed, terminate now

Tue Sep 17 12:21:28:212 2019


DpHdlDeadWp: W18 (pid=20942) terminated automatically

Tue Sep 17 12:23:13:372 2019


DpWpDynCreate: created new work process W19-23430

Tue Sep 17 12:28:15:710 2019


DpHdlDeadWp: W19 (pid=23430) terminated automatically

Tue Sep 17 12:33:14:842 2019


DpWpDynCreate: created new work process W20-27061

Tue Sep 17 12:37:01:217 2019


DpWpDynCreate: created new work process W7-28251

Tue Sep 17 12:38:27:863 2019


DpWpCheck: dyn W20, pid 27061 no longer needed, terminate now

Tue Sep 17 12:38:28:119 2019


DpHdlDeadWp: W20 (pid=27061) terminated automatically

Tue Sep 17 12:42:02:543 2019


DpHdlDeadWp: W7 (pid=28251) terminated automatically

Tue Sep 17 12:43:12:571 2019


DpWpDynCreate: created new work process W17-30316
DpWpDynCreate: created new work process W18-30317

Tue Sep 17 12:48:13:645 2019


DpWpCheck: dyn W17, pid 30316 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=30317) terminated automatically
DpHdlDeadWp: W17 (pid=30316) terminated automatically

Tue Sep 17 12:52:48:074 2019
DpWpDynCreate: created new work process W19-1109

Tue Sep 17 12:58:01:232 2019


DpHdlDeadWp: W19 (pid=1109) terminated automatically

Tue Sep 17 12:59:01:321 2019


DpWpDynCreate: created new work process W20-3142

Tue Sep 17 13:04:07:903 2019


DpWpCheck: dyn W20, pid 3142 no longer needed, terminate now

Tue Sep 17 13:04:08:729 2019


DpHdlDeadWp: W20 (pid=3142) terminated automatically

Tue Sep 17 13:08:09:699 2019


DpWpDynCreate: created new work process W7-31659

Tue Sep 17 13:13:10:152 2019


DpHdlDeadWp: W7 (pid=31659) terminated automatically

Tue Sep 17 13:13:33:481 2019


DpWpDynCreate: created new work process W18-3684

Tue Sep 17 13:18:39:685 2019


DpHdlDeadWp: W18 (pid=3684) terminated automatically

Tue Sep 17 13:18:42:745 2019


DpWpDynCreate: created new work process W17-5422
DpWpDynCreate: created new work process W19-5423

Tue Sep 17 13:23:48:750 2019


DpHdlDeadWp: W17 (pid=5422) terminated automatically
DpWpCheck: dyn W19, pid 5423 no longer needed, terminate now

Tue Sep 17 13:23:49:830 2019


DpHdlDeadWp: W19 (pid=5423) terminated automatically

Tue Sep 17 13:29:01:072 2019


DpWpDynCreate: created new work process W20-8893

Tue Sep 17 13:34:03:971 2019


DpHdlDeadWp: W20 (pid=8893) terminated automatically

Tue Sep 17 13:43:02:976 2019


DpWpDynCreate: created new work process W7-13747

Tue Sep 17 13:43:15:223 2019


DpWpDynCreate: created new work process W18-13884

Tue Sep 17 13:48:03:899 2019


DpHdlDeadWp: W7 (pid=13747) terminated automatically

Tue Sep 17 13:48:17:673 2019


DpHdlDeadWp: W18 (pid=13884) terminated automatically

Tue Sep 17 13:53:10:421 2019


DpWpDynCreate: created new work process W17-17019

Tue Sep 17 13:54:06:787 2019
DpWpDynCreate: created new work process W19-17265
DpWpDynCreate: created new work process W20-17266

Tue Sep 17 13:57:35:711 2019


DpHdlDeadWp: W12 (pid=6480) terminated automatically
DpWpDynCreate: created new work process W12-18427

Tue Sep 17 13:58:28:809 2019


DpWpCheck: dyn W17, pid 17019 no longer needed, terminate now

Tue Sep 17 13:58:29:836 2019


DpHdlDeadWp: W17 (pid=17019) terminated automatically

Tue Sep 17 13:59:08:031 2019


DpWpDynCreate: created new work process W7-18821
DpHdlDeadWp: W19 (pid=17265) terminated automatically
DpHdlDeadWp: W20 (pid=17266) terminated automatically

Tue Sep 17 14:04:28:817 2019


DpWpCheck: dyn W7, pid 18821 no longer needed, terminate now

Tue Sep 17 14:04:29:303 2019


DpHdlDeadWp: W7 (pid=18821) terminated automatically

Tue Sep 17 14:08:04:048 2019


DpWpDynCreate: created new work process W18-6958

Tue Sep 17 14:08:12:319 2019


DpWpDynCreate: created new work process W17-7490

Tue Sep 17 14:08:12:427 2019


DpWpDynCreate: created new work process W19-7496

Tue Sep 17 14:13:05:849 2019


DpHdlDeadWp: W18 (pid=6958) terminated automatically

Tue Sep 17 14:13:12:492 2019


DpWpDynCreate: created new work process W20-19099

Tue Sep 17 14:13:13:440 2019


DpHdlDeadWp: W17 (pid=7490) terminated automatically
DpWpCheck: dyn W19, pid 7496 no longer needed, terminate now

Tue Sep 17 14:13:13:900 2019


DpHdlDeadWp: W19 (pid=7496) terminated automatically

Tue Sep 17 14:18:15:442 2019


DpHdlDeadWp: W20 (pid=19099) terminated automatically

Tue Sep 17 14:22:00:917 2019


DpWpDynCreate: created new work process W7-22221

Tue Sep 17 14:22:01:060 2019


DpWpDynCreate: created new work process W18-22223

Tue Sep 17 14:24:00:837 2019


DpWpDynCreate: created new work process W17-22876
Tue Sep 17 14:27:02:019 2019
DpHdlDeadWp: W7 (pid=22221) terminated automatically
DpWpCheck: dyn W18, pid 22223 no longer needed, terminate now

Tue Sep 17 14:27:02:329 2019


DpHdlDeadWp: W18 (pid=22223) terminated automatically

Tue Sep 17 14:29:01:819 2019


DpHdlDeadWp: W17 (pid=22876) terminated automatically

Tue Sep 17 14:33:12:926 2019


DpWpDynCreate: created new work process W19-26222
DpWpDynCreate: created new work process W20-26227

Tue Sep 17 14:38:13:608 2019


DpWpCheck: dyn W19, pid 26222 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=26227) terminated automatically
DpHdlDeadWp: W19 (pid=26222) terminated automatically

Tue Sep 17 14:39:06:958 2019


DpWpDynCreate: created new work process W7-28084

Tue Sep 17 14:44:08:901 2019


DpWpCheck: dyn W7, pid 28084 no longer needed, terminate now

Tue Sep 17 14:44:09:653 2019


DpHdlDeadWp: W7 (pid=28084) terminated automatically

Tue Sep 17 14:50:00:868 2019


DpWpDynCreate: created new work process W18-31683

Tue Sep 17 14:55:02:623 2019


DpHdlDeadWp: W18 (pid=31683) terminated automatically
DpWpDynCreate: created new work process W18-869

Tue Sep 17 14:58:21:517 2019


DpWpDynCreate: created new work process W17-2071

Tue Sep 17 15:00:06:049 2019


DpHdlDeadWp: W18 (pid=869) terminated automatically

Tue Sep 17 15:03:22:720 2019


DpHdlDeadWp: W17 (pid=2071) terminated automatically

Tue Sep 17 15:08:13:232 2019


DpWpDynCreate: created new work process W20-28403
DpWpDynCreate: created new work process W19-28405

Tue Sep 17 15:09:15:051 2019


DpWpDynCreate: created new work process W7-1192

Tue Sep 17 15:13:28:956 2019


DpWpCheck: dyn W19, pid 28405 no longer needed, terminate now
DpWpCheck: dyn W20, pid 28403 no longer needed, terminate now

Tue Sep 17 15:13:29:596 2019


DpHdlDeadWp: W19 (pid=28405) terminated automatically
DpHdlDeadWp: W20 (pid=28403) terminated automatically
Tue Sep 17 15:14:28:958 2019
DpWpCheck: dyn W7, pid 1192 no longer needed, terminate now

Tue Sep 17 15:14:29:651 2019


DpHdlDeadWp: W7 (pid=1192) terminated automatically

Tue Sep 17 15:23:13:042 2019


DpWpDynCreate: created new work process W18-6404
DpWpDynCreate: created new work process W17-6405

Tue Sep 17 15:28:15:413 2019


DpWpCheck: dyn W17, pid 6405 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=6404) terminated automatically
DpHdlDeadWp: W17 (pid=6405) terminated automatically

Tue Sep 17 15:33:14:663 2019


DpWpDynCreate: created new work process W19-9697

Tue Sep 17 15:38:15:234 2019


DpHdlDeadWp: W19 (pid=9697) terminated automatically

Tue Sep 17 15:43:11:887 2019


DpWpDynCreate: created new work process W20-13042

Tue Sep 17 15:48:13:572 2019


DpHdlDeadWp: W20 (pid=13042) terminated automatically

Tue Sep 17 15:49:06:488 2019


DpWpDynCreate: created new work process W7-14931

Tue Sep 17 15:52:09:683 2019


DpWpDynCreate: created new work process W18-15895

Tue Sep 17 15:54:09:025 2019


DpWpCheck: dyn W7, pid 14931 no longer needed, terminate now

Tue Sep 17 15:54:09:588 2019


DpHdlDeadWp: W7 (pid=14931) terminated automatically

Tue Sep 17 15:57:29:110 2019


DpHdlDeadWp: W18 (pid=15895) terminated automatically

Tue Sep 17 16:02:53:070 2019


DpWpDynCreate: created new work process W17-24598

Tue Sep 17 16:08:01:512 2019


DpHdlDeadWp: W17 (pid=24598) terminated automatically

Tue Sep 17 16:08:01:916 2019


DpWpDynCreate: created new work process W19-12287

Tue Sep 17 16:13:07:171 2019


DpHdlDeadWp: W19 (pid=12287) terminated automatically

Tue Sep 17 16:13:14:445 2019


DpWpDynCreate: created new work process W20-18276

Tue Sep 17 16:18:16:069 2019


DpHdlDeadWp: W20 (pid=18276) terminated automatically

Tue Sep 17 16:22:01:369 2019


DpWpDynCreate: created new work process W7-21474

Tue Sep 17 16:27:03:125 2019


DpHdlDeadWp: W7 (pid=21474) terminated automatically

Tue Sep 17 16:33:15:161 2019


DpWpDynCreate: created new work process W18-25547

Tue Sep 17 16:38:17:688 2019


DpHdlDeadWp: W18 (pid=25547) terminated automatically

Tue Sep 17 16:43:07:836 2019


*** ERROR => DpHdlDeadWp: W12 (pid 18427) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W12 (pid = 18427)

********** SERVER SNAPSHOT 1 (Reason: Workprocess 12 died / Time: Tue Sep 17 16:43:07 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 16:43:07 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 2
Queue Statistics Tue Sep 17 16:43:07 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 3 (peak 291, writeCount 10859660, readCount 10859657)


UPD : 0 (peak 31, writeCount 2226, readCount 2226)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1079999, readCount 1079999)
SPO : 0 (peak 2, writeCount 10837, readCount 10837)
UP2 : 0 (peak 1, writeCount 1045, readCount 1045)
DISP: 0 (peak 67, writeCount 416666, readCount 416666)
GW : 0 (peak 45, writeCount 9924500, readCount 9924500)
ICM : 0 (peak 186, writeCount 196049, readCount 196049)
LWP : 0 (peak 15, writeCount 16633, readCount 16633)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <T93_U27370_M0> (1 requests):


- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T24_U27368_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T5_U21579_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 67 (rq_id 29133617, NOWP, REQ_HANDLER_RFC_RESP) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 16:43:07 2019


------------------------------------------------------------

Current snapshot id: 1


DB clean time (in percent of total time) : 23.41 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0|31517 |DIA |WP_HOLD|RFC | |low |T101_U21586_M0 |ASYNC_RFC| | | 0| |001|SM_EFWK | | |
| 4|19804 |DIA |WP_HOLD|RFC | |low |T108_U21588_M0 |ASYNC_RFC| | | 1| |001|SM_EFWK | | |
| 12| |BTC |WP_KILL| |1 |low |T105_U21576_M0 |BATCH | | | | |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 16:43:07 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|ASYNC_RFC |T5_U21579_M0 |001|SM_EFWK | |16:43:04|0 |SAPMSSY1 |low |1 | | | 41103|
|ASYNC_RFC |T24_U27368_M0 |001|EXT_SCHAITAN| |16:43:00|16 |SAPMSSY1 |low |1 | | | 4237|
|SYNC_RFC |T26_U10027_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |16:41:47|16 |SAPMSSY1 |norm| | | | 4233|
|BGRFC_SCHEDU|T39_U484_M0 |001|BGRFC_SUSR |smprd02.niladv.org |16:43:03|2 |SAPMSSYD |high| | | | 4247|
|BGRFC_SCHEDU|T44_U759_M0 |001|BGRFC_SUSR |smprd02.niladv.org |16:42:03|1 |SAPMSSY1 |high| | | | 4234|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |16:42:57|3 |SAPMSSY1 |norm| | | | 4234|
|RFC_UI |T93_U27370_M0 |001|EXT_SCHAITAN| |16:43:00|2 |SAPMSSY1 |high|1 | | | 4237|
|ASYNC_RFC |T101_U21586_M0 |001|SM_EFWK |smprd02.niladv.org |16:43:07|0 |SAPMSSY1 |low | | | | 8329|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|ASYNC_RFC |T108_U21588_M0 |001|SM_EFWK |smprd02.niladv.org |16:43:06|4 |SAPMSSY1 |low | | | | 8329|
|SYNC_RFC |T110_U20257_M0 |001|SMD_RFC |smprd02.niladv.org |16:36:11|0 |SAPMSSY1 |norm| | | | 4248|
|SYNC_RFC |T138_U13519_M0 |001|SAPJSF |smprd02.niladv.org |16:42:07|16 |SAPMSSY1 |norm| | | | 4246|
|SYNC_RFC |T140_U793_M0 |001|SMD_RFC |smprd02.niladv.org |16:41:05|16 |SAPMSSY1 |norm| | | | 4249|

Found 13 logons with 13 sessions


Total ES (gross) memory of all sessions: 105 MB
Most ES (gross) memory allocated by T5_U21579_M0: 40 MB
RFC-Connection Table (13 entries) Tue Sep 17 16:43:07 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 12|42528809|42528809CU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 1|Tue Sep 17 10:51:47 2019 |
| 15|63825624|63825624SU21586_M0 |T101_U21586_M0_I|ALLOCATED |SERVER|NO_REQUEST| 5|Tue Sep 17 16:43:05 2019 |
| 25|63206103|63206103SU20257_M0 |T110_U20257_M0_I|ALLOCATED |SERVER|SAP_SEND | 0|Tue Sep 17 16:36:11 2019 |
| 26|04315553|04315553SU27368_M0 |T24_U27368_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 3|Tue Sep 17 00:00:05 2019 |
| 28|61869862|61869862SU13519_M0 |T138_U13519_M0_I|ALLOCATED |SERVER|SAP_SEND | 16|Tue Sep 17 16:42:07 2019 |
| 44|63825624|63825624CU21579_M0 |T5_U21579_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 0|Tue Sep 17 16:43:05 2019 |
| 57|58048941|58048941SU793_M0 |T140_U793_M0_I0 |ALLOCATED |SERVER|SAP_SEND | 16|Tue Sep 17 16:41:05 2019 |
| 60|04317636|04317636SU27370_M0 |T93_U27370_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 3|Tue Sep 17 00:00:05 2019 |
| 113|63826640|63826640SU21588_M0 |T108_U21588_M0_I|ALLOCATED |SERVER|NO_REQUEST| 4|Tue Sep 17 16:43:06 2019 |
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 16:42:57 2019 |
| 156|63816858|63816858SU21579_M0 |T5_U21579_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Tue Sep 17 16:43:04 2019 |
| 166|63826640|63826640CU21586_M0 |T101_U21586_M0_I|ALLOCATED |CLIENT|SAP_SEND | 0|Tue Sep 17 16:43:07 2019 |
| 203|42528809|42528809SU10027_M0 |T26_U10027_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 16|Tue Sep 17 16:41:47 2019 |

Found 13 RFC-Connections

CA Blocks
------------------------------------------------------------
1 WORKER 19804
2 GATEWAY 32117
3 WORKER 32120
333 WORKER 32120
337 INVALID -1
342 INVALID -1
343 INVALID -1
7 ca_blk slots of 6000 in use, 3 currently unowned (in request queues)

MPI Info Tue Sep 17 16:43:08 2019


------------------------------------------------------------
Current pipes in use: 11
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 16:43:08 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2460| 50| | |
| 1|DDLOG | 2460| 50| | |
| 2|BTCSCHED | 4922| 51| | |
| 3|RESTART_ALL | 984| 108| | |
| 4|ENVCHECK | 14768| 1| | |
| 5|AUTOABAP | 984| 108| | |
| 6|BGRFC_WATCHDOG | 985| 108| | |
| 7|AUTOTH | 1003| 57| | |
| 8|AUTOCCMS | 4922| 51| | |
| 9|AUTOSECURITY | 4922| 51| | |
| 10|LOAD_CALCULATION | 295060| 1| | |
| 11|SPOOLALRM | 4923| 51| | |
| 12|CALL_DELAYED | 0| 51| | |
| 13|TIMEOUT | 0| 295|T39_U484_M0 | 29132700|
| 14|TIMEOUT | 0| 235|T44_U759_M0 | 29131096|
| 15|TIMEOUT | 0| 52|T24_U27368_M0 | 29131665|
| 16|TIMEOUT | 0| 52|T93_U27370_M0 | 29131867|

Found 17 periodic tasks

********** SERVER SNAPSHOT 1 (Reason: Workprocess 12 died / Time: Tue Sep 17 16:43:08 2019) - end **********

Tue Sep 17 16:43:08:401 2019


***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W12-28799

Tue Sep 17 16:43:09:808 2019


*** ERROR => DpHdlDeadWp: W12 (pid 28799) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28799) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28799)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:43:13:576 2019


*** ERROR => DpHdlDeadWp: W1 (pid 32120) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W1 (pid = 32120)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-28830
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:43:15:080 2019


*** ERROR => DpHdlDeadWp: W1 (pid 28830) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28830) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28830)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:43:29:808 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:43:49:808 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:44:08:403 2019


DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 29085

Tue Sep 17 16:44:09:808 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:44:15:173 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:44:29:809 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 29085 terminated

Tue Sep 17 16:44:49:810 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:45:09:811 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
DpWpDynCreate: created new work process W1-29771
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
DpWpDynCreate: created new work process W12-29772

Tue Sep 17 16:45:11:526 2019


*** ERROR => DpHdlDeadWp: W1 (pid 29771) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29771) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 29771)
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 29772) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29772) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 29772)
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:45:29:811 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:45:49:812 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:46:09:813 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:46:29:813 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:46:49:814 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:47:09:814 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:47:29:815 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:47:49:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:48:09:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:48:29:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot

Tue Sep 17 16:48:49:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:44:08 2019, skip new snapshot
Tue Sep 17 16:49:09:818 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 2 (Reason: Workprocess 1 died / Time: Tue Sep 17 16:49:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 16:49:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 16:49:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10860207, readCount 10860207)


UPD : 0 (peak 31, writeCount 2227, readCount 2227)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080003, readCount 1080003)
SPO : 0 (peak 2, writeCount 10850, readCount 10850)
UP2 : 0 (peak 1, writeCount 1046, readCount 1046)
DISP: 0 (peak 67, writeCount 416722, readCount 416722)
GW : 0 (peak 45, writeCount 9924808, readCount 9924808)
ICM : 0 (peak 186, writeCount 196077, readCount 196077)
LWP : 2 (peak 15, writeCount 16648, readCount 16646)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 16:49:09 2019


------------------------------------------------------------

Current snapshot id: 2


DB clean time (in percent of total time) : 23.42 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |3 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |3 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 16:49:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |16:48:57|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T140_U793_M0 |001|SMD_RFC |smprd02.niladv.org |16:41:05|16 |SAPMSSY1 |norm| | | | 4249|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (2 entries) Tue Sep 17 16:49:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 57|58048941|58048941SU793_M0 |T140_U793_M0_I0 |ALLOCATED |SERVER|SAP_SEND | 16|Tue Sep 17 16:41:05 2019 |
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 16:48:57 2019 |

Found 2 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 16:49:09 2019


------------------------------------------------------------
Current pipes in use: 115
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 16:49:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2463| 49| | |
| 1|DDLOG | 2463| 49| | |
| 2|BTCSCHED | 4928| 50| | |
| 3|RESTART_ALL | 985| 47| | |
| 4|ENVCHECK | 14787| 20| | |
| 5|AUTOABAP | 985| 47| | |
| 6|BGRFC_WATCHDOG | 986| 47| | |
| 7|AUTOTH | 1009| 56| | |
| 8|AUTOCCMS | 4928| 50| | |
| 9|AUTOSECURITY | 4928| 50| | |
| 10|LOAD_CALCULATION | 295420| 1| | |
| 11|SPOOLALRM | 4929| 50| | |
| 12|CALL_DELAYED | 0| 377| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 2 (Reason: Workprocess 1 died / Time: Tue Sep 17 16:49:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:49:29:819 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:49:49:819 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:50:09:819 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-31112
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-31113
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 31114

Tue Sep 17 16:50:11:507 2019


*** ERROR => DpHdlDeadWp: W1 (pid 31112) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31112) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31112)
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 31113) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31113) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 31113)
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:50:15:581 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:50:29:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 31114 terminated

Tue Sep 17 16:50:49:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:51:09:821 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:51:29:821 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:51:49:821 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:52:09:822 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:52:29:823 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new snapshot

Tue Sep 17 16:52:49:823 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:53:09:824 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:53:29:825 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:53:49:825 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:54:09:825 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:54:29:826 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:54:49:827 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:50:09 2019, skip new
snapshot

Tue Sep 17 16:55:09:827 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 3 (Reason: Workprocess 1 died / Time: Tue Sep 17 16:55:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 16:55:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 16:55:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10860602, readCount 10860602)
UPD : 0 (peak 31, writeCount 2228, readCount 2228)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080007, readCount 1080007)
SPO : 0 (peak 2, writeCount 10863, readCount 10863)
UP2 : 0 (peak 1, writeCount 1047, readCount 1047)
DISP: 0 (peak 67, writeCount 416767, readCount 416767)
GW  : 0 (peak 45, writeCount 9924955, readCount 9924955)
ICM : 0 (peak 186, writeCount 196104, readCount 196104)
LWP : 2 (peak 15, writeCount 16663, readCount 16661)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 16:55:09 2019
------------------------------------------------------------
Current snapshot id: 3
DB clean time (in percent of total time) : 23.43 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |4  |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |4  |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 16:55:09 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |16:54:57|5  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 16:55:09 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  5|Tue Sep 17 16:54:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 16:55:09 2019
------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 16:55:09 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2466|        49|                    |          |
|       1|DDLOG               |      2466|        49|                    |          |
|       2|BTCSCHED            |      4934|        50|                    |          |
|       3|RESTART_ALL         |       987|       287|                    |          |
|       4|ENVCHECK            |     14805|        20|                    |          |
|       5|AUTOABAP            |       987|       287|                    |          |
|       6|BGRFC_WATCHDOG      |       988|       287|                    |          |
|       7|AUTOTH              |      1015|        56|                    |          |
|       8|AUTOCCMS            |      4934|        50|                    |          |
|       9|AUTOSECURITY        |      4934|        50|                    |          |
|      10|LOAD_CALCULATION    |    295778|         0|                    |          |
|      11|SPOOLALRM           |      4935|        50|                    |          |
|      12|CALL_DELAYED        |         0|        17|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 3 (Reason: Workprocess 1 died / Time: Tue Sep 17 16:55:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-555
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:55:09:833 2019
DpWpDynCreate: created new work process W12-556

Tue Sep 17 16:55:11:540 2019
*** ERROR => DpHdlDeadWp: W1 (pid 555) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=555) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 555)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 556) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=556) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 556)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:55:29:828 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:55:49:829 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 16:56:09:829 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 828

Tue Sep 17 16:56:15:834 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:56:29:830 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 828 terminated

Tue Sep 17 16:56:49:831 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:57:09:831 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:57:29:832 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:57:49:833 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:58:09:834 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:58:29:834 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:58:49:834 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:59:09:835 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:59:29:836 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 16:59:49:837 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 17:00:09:838 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-2402
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-2405

Tue Sep 17 17:00:11:609 2019
*** ERROR => DpHdlDeadWp: W1 (pid 2402) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2402) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 2402)
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 2405) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2405) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 2405)
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 17:00:29:838 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 17:00:49:839 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 16:56:09 2019, skip new snapshot

Tue Sep 17 17:01:09:839 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 4 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:01:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:01:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:01:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10861010, readCount 10861010)
UPD : 0 (peak 31, writeCount 2229, readCount 2229)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080011, readCount 1080011)
SPO : 0 (peak 2, writeCount 10876, readCount 10876)
UP2 : 0 (peak 1, writeCount 1048, readCount 1048)
DISP: 0 (peak 67, writeCount 416811, readCount 416811)
GW  : 0 (peak 45, writeCount 9925103, readCount 9925103)
ICM : 1 (peak 186, writeCount 196131, readCount 196130)
LWP : 0 (peak 15, writeCount 16678, readCount 16678)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:01:09 2019
------------------------------------------------------------
Current snapshot id: 4
DB clean time (in percent of total time) : 23.43 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |6  |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |6  |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 17:01:09 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI         |T5_U22105_M0    |000|            |SST-LAP-HP0002      |16:58:18|0  |SAPMSYST                                |high|     |                                                  |SESSION_MA|      4202|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |17:00:57|4  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:01:09 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  4|Tue Sep 17 17:00:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:01:09 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:01:09 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2469|        49|                    |          |
|       1|DDLOG               |      2469|        49|                    |          |
|       2|BTCSCHED            |      4940|        50|                    |          |
|       3|RESTART_ALL         |       988|       227|                    |          |
|       4|ENVCHECK            |     14823|        20|                    |          |
|       5|AUTOABAP            |       988|       227|                    |          |
|       6|BGRFC_WATCHDOG      |       989|       227|                    |          |
|       7|AUTOTH              |      1021|        56|                    |          |
|       8|AUTOCCMS            |      4940|        50|                    |          |
|       9|AUTOSECURITY        |      4940|        50|                    |          |
|      10|LOAD_CALCULATION    |    296137|         1|                    |          |
|      11|SPOOLALRM           |      4941|        50|                    |          |
|      12|CALL_DELAYED        |         0|       663|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 4 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:01:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:01:29:839 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:01:49:839 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:02:09:840 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 6898

Tue Sep 17 17:02:15:792 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:02:29:841 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 6898 terminated

Tue Sep 17 17:02:49:842 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:03:09:843 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:03:29:843 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:03:49:844 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:04:09:844 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:04:29:845 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:04:49:846 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:05:09:846 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19174
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19175

Tue Sep 17 17:05:11:570 2019
*** ERROR => DpHdlDeadWp: W1 (pid 19174) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19174) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19174)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19175) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19175) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19175)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:05:29:847 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:05:49:848 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:06:09:848 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:06:29:849 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:06:49:849 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:02:09 2019, skip new snapshot

Tue Sep 17 17:07:09:849 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 5 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:07:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:07:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:07:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10861527, readCount 10861527)
UPD : 0 (peak 31, writeCount 2230, readCount 2230)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080015, readCount 1080015)
SPO : 0 (peak 2, writeCount 10889, readCount 10889)
UP2 : 0 (peak 1, writeCount 1049, readCount 1049)
DISP: 0 (peak 67, writeCount 416851, readCount 416851)
GW  : 0 (peak 45, writeCount 9925373, readCount 9925373)
ICM : 1 (peak 186, writeCount 196158, readCount 196157)
LWP : 0 (peak 15, writeCount 16693, readCount 16693)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:07:09 2019


------------------------------------------------------------

Current snapshot id: 5


DB clean time (in percent of total time) : 23.44 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |7 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |7 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 17:07:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T5_U22105_M0 |000| |SST-LAP-HP0002 |16:58:18|0 |SAPMSYST |high| | |SESSION_MA| 4202|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:06:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:07:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 17:06:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:07:09 2019


------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:07:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2472| 49| | |
| 1|DDLOG | 2472| 49| | |
| 2|BTCSCHED | 4946| 50| | |
| 3|RESTART_ALL | 989| 167| | |
| 4|ENVCHECK | 14841| 20| | |
| 5|AUTOABAP | 989| 167| | |
| 6|BGRFC_WATCHDOG | 990| 167| | |
| 7|AUTOTH | 1027| 56| | |
| 8|AUTOCCMS | 4946| 50| | |
| 9|AUTOSECURITY | 4946| 50| | |
| 10|LOAD_CALCULATION | 296496| 1| | |
| 11|SPOOLALRM | 4947| 50| | |
| 12|CALL_DELAYED | 0| 303| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 5 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:07:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:07:29:849 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:07:49:850 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:08:09:851 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 32756

Tue Sep 17 17:08:15:122 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:08:29:851 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 32756 terminated
Tue Sep 17 17:08:49:851 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:09:09:852 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:09:29:853 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:09:49:854 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:10:09:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-1254
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-1255

Tue Sep 17 17:10:11:575 2019


*** ERROR => DpHdlDeadWp: W1 (pid 1254) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1254) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1254)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1255) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1255) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1255)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:10:29:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:10:49:856 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:11:09:857 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:11:29:857 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:11:49:858 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:12:09:859 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:12:29:860 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:12:49:860 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:08:09 2019, skip new snapshot

Tue Sep 17 17:13:09:861 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 6 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:13:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:13:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:13:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10861985, readCount 10861985)
UPD : 0 (peak 31, writeCount 2232, readCount 2232)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080023, readCount 1080023)
SPO : 0 (peak 2, writeCount 10903, readCount 10903)
UP2 : 0 (peak 1, writeCount 1051, readCount 1051)
DISP: 0 (peak 67, writeCount 416894, readCount 416894)
GW : 0 (peak 45, writeCount 9925557, readCount 9925557)
ICM : 0 (peak 186, writeCount 196187, readCount 196187)
LWP : 2 (peak 15, writeCount 16723, readCount 16721)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:13:09 2019


------------------------------------------------------------

Current snapshot id: 6


DB clean time (in percent of total time) : 23.45 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |8 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |8 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 17:13:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T5_U22105_M0 |000| |SST-LAP-HP0002 |16:58:18|0 |SAPMSYST |high| | |SESSION_MA| 4202|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:12:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:13:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 17:12:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:13:09 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:13:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2475| 49| | |
| 1|DDLOG | 2475| 49| | |
| 2|BTCSCHED | 4952| 50| | |
| 3|RESTART_ALL | 990| 107| | |
| 4|ENVCHECK | 14859| 20| | |
| 5|AUTOABAP | 990| 107| | |
| 6|BGRFC_WATCHDOG | 991| 107| | |
| 7|AUTOTH | 1033| 56| | |
| 8|AUTOCCMS | 4952| 50| | |
| 9|AUTOSECURITY | 4952| 50| | |
| 10|LOAD_CALCULATION | 296855| 1| | |
| 11|SPOOLALRM | 4953| 50| | |
| 12|CALL_DELAYED | 0| 191| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 6 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:13:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:13:29:861 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:13:49:861 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:14:09:862 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 2614

Tue Sep 17 17:14:15:921 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:14:29:863 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 2614 terminated

Tue Sep 17 17:14:49:864 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:15:09:865 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3276
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3277

Tue Sep 17 17:15:11:590 2019


*** ERROR => DpHdlDeadWp: W1 (pid 3276) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3276) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3276)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3277) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3277) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 3277)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:15:29:865 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:15:49:866 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:16:09:866 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:16:29:867 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:16:49:867 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:17:09:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:17:29:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:17:49:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:18:09:869 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:18:29:870 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot

Tue Sep 17 17:18:49:870 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:14:09 2019, skip new snapshot
Tue Sep 17 17:19:09:871 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 7 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:19:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:19:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:19:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10862401, readCount 10862401)
UPD : 0 (peak 31, writeCount 2233, readCount 2233)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080027, readCount 1080027)
SPO : 0 (peak 2, writeCount 10916, readCount 10916)
UP2 : 0 (peak 1, writeCount 1052, readCount 1052)
DISP: 0 (peak 67, writeCount 416938, readCount 416938)
GW : 1 (peak 45, writeCount 9925729, readCount 9925728)
ICM : 0 (peak 186, writeCount 196214, readCount 196214)
LWP : 2 (peak 15, writeCount 16738, readCount 16736)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:19:09 2019


------------------------------------------------------------

Current snapshot id: 7


DB clean time (in percent of total time) : 23.46 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |9 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |9 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 17:19:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:18:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:19:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 17:18:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:19:09 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:19:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2478| 49| | |
| 1|DDLOG | 2478| 49| | |
| 2|BTCSCHED | 4958| 50| | |
| 3|RESTART_ALL | 991| 47| | |
| 4|ENVCHECK | 14877| 20| | |
| 5|AUTOABAP | 991| 47| | |
| 6|BGRFC_WATCHDOG | 992| 47| | |
| 7|AUTOTH | 1039| 56| | |
| 8|AUTOCCMS | 4958| 50| | |
| 9|AUTOSECURITY | 4958| 50| | |
| 10|LOAD_CALCULATION | 297214| 1| | |
| 11|SPOOLALRM | 4959| 50| | |
| 12|CALL_DELAYED | 0| 1185| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 7 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:19:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:19:29:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:19:49:873 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:20:09:873 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-4853
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-4854
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 4855

Tue Sep 17 17:20:11:577 2019


*** ERROR => DpHdlDeadWp: W1 (pid 4853) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4853) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 4853)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 4854) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4854) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 4854)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new snapshot

Tue Sep 17 17:20:16:286 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new snapshot

Tue Sep 17 17:20:29:874 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 4855 terminated

Tue Sep 17 17:20:49:874 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:21:09:874 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:21:29:875 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:21:49:875 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:22:09:876 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:22:29:876 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
Tue Sep 17 17:22:49:876 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:23:09:877 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:23:29:877 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:23:49:878 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:24:09:878 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:24:29:878 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:24:49:879 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:20:09 2019, skip new
snapshot

Tue Sep 17 17:25:09:879 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
********** SERVER SNAPSHOT 8 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:25:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:25:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:25:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10862921, readCount 10862921)
UPD : 0 (peak 31, writeCount 2234, readCount 2234)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080031, readCount 1080031)
SPO : 0 (peak 2, writeCount 10929, readCount 10929)
UP2 : 0 (peak 1, writeCount 1053, readCount 1053)
DISP: 0 (peak 67, writeCount 416983, readCount 416983)
GW : 0 (peak 45, writeCount 9926001, readCount 9926001)
ICM : 0 (peak 186, writeCount 196244, readCount 196244)
LWP : 2 (peak 15, writeCount 16753, readCount 16751)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:25:09 2019


------------------------------------------------------------

Current snapshot id: 8


DB clean time (in percent of total time) : 23.47 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |10 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_RUN | | |norm|T49_U23053_M0 |HTTP_NORM| | | 0|<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |10 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 17:25:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T49_U23053_M0 |001|SM_EXTERN_WS|10.54.36.37 |17:25:09|2 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:24:57|2 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:25:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Tue Sep 17 17:24:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:25:09 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:25:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2481| 49| | |
| 1|DDLOG | 2481| 49| | |
| 2|BTCSCHED | 4964| 50| | |
| 3|RESTART_ALL | 993| 287| | |
| 4|ENVCHECK | 14895| 20| | |
| 5|AUTOABAP | 993| 287| | |
| 6|BGRFC_WATCHDOG | 994| 287| | |
| 7|AUTOTH | 1045| 56| | |
| 8|AUTOCCMS | 4964| 50| | |
| 9|AUTOSECURITY | 4964| 50| | |
| 10|LOAD_CALCULATION | 297572| 1| | |
| 11|SPOOLALRM | 4965| 50| | |
| 12|CALL_DELAYED | 0| 825| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 8 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:25:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
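The snapshot above closes one round of a cycle that repeats throughout this trace: W1 and W12 are re-created via DpWpDynCreate, die within about two seconds with exit code 255, and the dispatcher then logs wp_adm warnings until the next restart attempt. A minimal sketch for tallying those deaths from a dev_disp file; the regex, function name, and sample lines are my own, not SAP tooling:

```python
import re
from collections import Counter

# Matches lines of the form:
# *** ERROR => DpHdlDeadWp: W1 (pid 4853) died (severity=0, status=65280) [dpxxwp.c 1463]
DIED_RE = re.compile(r"DpHdlDeadWp: (W\d+) \(pid (\d+)\) died .*status=(\d+)")

def count_wp_deaths(lines):
    """Count how often each work-process slot died, keyed by (slot, raw wait status)."""
    deaths = Counter()
    for line in lines:
        m = DIED_RE.search(line)
        if m:
            slot, _pid, status = m.groups()
            deaths[(slot, int(status))] += 1
    return deaths

sample = [
    "*** ERROR => DpHdlDeadWp: W1 (pid 4853) died (severity=0, status=65280) [dpxxwp.c 1463]",
    "*** ERROR => DpHdlDeadWp: W12 (pid 4854) died (severity=0, status=65280) [dpxxwp.c 1463]",
    "*** ERROR => DpHdlDeadWp: W1 (pid 6666) died (severity=0, status=65280) [dpxxwp.c 1463]",
]
print(count_wp_deaths(sample))  # W1 died twice, W12 once, all with status 65280
```

Run against the whole file (e.g. `count_wp_deaths(open("dev_disp"))`), this makes the restart loop obvious: only slots W1 and W12 ever die, always with the same status.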


DpWpDynCreate: created new work process W1-6666
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:25:09:885 2019


DpWpDynCreate: created new work process W12-6667

Tue Sep 17 17:25:11:589 2019


*** ERROR => DpHdlDeadWp: W1 (pid 6666) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6666) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6666)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6667) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6667) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6667)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:25:29:879 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:25:49:880 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:26:09:880 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 6938

Tue Sep 17 17:26:16:125 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:26:29:881 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 6938 terminated

Tue Sep 17 17:26:49:882 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:27:09:882 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:27:29:883 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:27:49:883 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:28:09:884 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:28:29:884 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:28:49:884 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:29:09:885 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:29:29:886 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:29:49:886 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:30:09:887 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-8632
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-8633

Tue Sep 17 17:30:11:602 2019


*** ERROR => DpHdlDeadWp: W1 (pid 8632) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8632) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8632)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8633) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8633) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8633)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:30:29:887 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:30:49:888 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:26:09 2019, skip new snapshot

Tue Sep 17 17:31:09:888 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 9 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:31:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:31:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:31:09 2019

------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10863716, readCount 10863716)
UPD : 0 (peak 31, writeCount 2235, readCount 2235)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080035, readCount 1080035)
SPO : 0 (peak 2, writeCount 10942, readCount 10942)
UP2 : 0 (peak 1, writeCount 1054, readCount 1054)
DISP: 0 (peak 67, writeCount 417024, readCount 417024)
GW : 0 (peak 45, writeCount 9926521, readCount 9926521)
ICM : 1 (peak 186, writeCount 196271, readCount 196270)
LWP : 0 (peak 15, writeCount 16768, readCount 16768)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:31:09 2019


------------------------------------------------------------

Current snapshot id: 9


DB clean time (in percent of total time) : 23.48 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |12 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |12 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 17:31:09 2019


------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:30:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:31:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 17:30:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:31:09 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:31:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2484| 49| | |
| 1|DDLOG | 2484| 49| | |
| 2|BTCSCHED | 4970| 50| | |
| 3|RESTART_ALL | 994| 227| | |
| 4|ENVCHECK | 14913| 20| | |
| 5|AUTOABAP | 994| 227| | |
| 6|BGRFC_WATCHDOG | 995| 227| | |
| 7|AUTOTH | 1051| 56| | |
| 8|AUTOCCMS | 4970| 50| | |
| 9|AUTOSECURITY | 4970| 50| | |
| 10|LOAD_CALCULATION | 297931| 1| | |
| 11|SPOOLALRM | 4971| 50| | |
| 12|CALL_DELAYED | 0| 465| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 9 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:31:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
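The status=65280 value in the DpHdlDeadWp errors throughout this trace is a raw POSIX wait status; the exit code 255 that DpTraceWpStatus reports is its high byte (65280 = 255 << 8). A quick check, assuming a POSIX Python environment:

```python
import os

status = 65280  # raw wait status reported by DpHdlDeadWp for the dying work processes

# POSIX wait status layout: low 7 bits carry the terminating signal (0 = normal
# exit), the next 8 bits carry the child's exit code.
assert os.WIFEXITED(status)    # the work process exited normally, not via a signal
print(os.WEXITSTATUS(status))  # 255, matching "exited with exit code 255" above
```

So the work processes are exiting on their own with code 255 (a disp+work startup failure), rather than being killed by the OS.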


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:31:29:889 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:31:49:889 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:32:09:890 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9156

Tue Sep 17 17:32:16:282 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:32:29:890 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9156 terminated

Tue Sep 17 17:32:49:891 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:33:09:892 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:33:29:893 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:33:49:894 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:34:09:894 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:34:29:895 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:34:49:895 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:35:09:896 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10266
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10267

Tue Sep 17 17:35:11:659 2019


*** ERROR => DpHdlDeadWp: W1 (pid 10266) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10266) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10266)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10267) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10267) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10267)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:35:29:897 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:35:49:897 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:36:09:898 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:36:29:899 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:36:49:899 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:32:09 2019, skip new snapshot

Tue Sep 17 17:37:09:900 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 10 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:37:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:37:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:37:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10864521, readCount 10864521)
UPD : 0 (peak 31, writeCount 2236, readCount 2236)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080039, readCount 1080039)
SPO : 0 (peak 2, writeCount 10955, readCount 10955)
UP2 : 0 (peak 1, writeCount 1055, readCount 1055)
DISP: 0 (peak 67, writeCount 417065, readCount 417065)
GW : 0 (peak 45, writeCount 9927073, readCount 9927073)
ICM : 1 (peak 186, writeCount 196298, readCount 196297)
LWP : 0 (peak 15, writeCount 16783, readCount 16783)
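The queue counters above are self-consistent and can be cross-checked: a queue's current depth (the leading number) is simply writeCount minus readCount. A minimal sketch using the ICM values from this snapshot:

```python
# Cross-check of a dispatcher queue line: current depth = writeCount - readCount.
# Values copied from the ICM line of this snapshot (depth reported as 1).
write_count = 196298   # requests ever enqueued
read_count = 196297    # requests ever consumed

backlog = write_count - read_count
print(backlog)  # 1 -> one request still sitting in the ICM queue
```

The same arithmetic gives 0 for every other queue in this snapshot, i.e. all of them are drained.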

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:37:09 2019
------------------------------------------------------------
Current snapshot id: 10
DB clean time (in percent of total time) : 23.48 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |13 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |13 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 17:37:09 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:36:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:37:09 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 17:36:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:37:09 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:37:09 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2487| 49| | |
| 1|DDLOG | 2487| 49| | |
| 2|BTCSCHED | 4976| 50| | |
| 3|RESTART_ALL | 995| 167| | |
| 4|ENVCHECK | 14931| 20| | |
| 5|AUTOABAP | 995| 167| | |
| 6|BGRFC_WATCHDOG | 996| 167| | |
| 7|AUTOTH | 1057| 56| | |
| 8|AUTOCCMS | 4976| 50| | |
| 9|AUTOSECURITY | 4976| 50| | |
| 10|LOAD_CALCULATION | 298290| 1| | |
| 11|SPOOLALRM | 4977| 50| | |
| 12|CALL_DELAYED | 0| 105| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 10 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:37:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:37:29:900 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:37:49:901 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:38:09:901 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 11325

Tue Sep 17 17:38:16:110 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:38:29:901 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 11325 terminated

Tue Sep 17 17:38:49:902 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:39:09:903 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:39:29:903 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:39:49:904 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:40:09:904 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-12311
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-12312

Tue Sep 17 17:40:11:622 2019

*** ERROR => DpHdlDeadWp: W1 (pid 12311) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12311) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 12311)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 12312) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12312) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 12312)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
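The pair of trace lines per dead work process is consistent: "status=65280" is the raw POSIX wait status, and "exit code 255" is what the standard WEXITSTATUS macro extracts from it. A minimal sketch of that decoding:

```python
# Decode the raw wait status reported by DpHdlDeadWp: "died (severity=0, status=65280)".
# 65280 == 0xFF00; in the POSIX wait() encoding the exit code lives in the high byte,
# which matches the following trace line "exited with exit code 255".
status = 65280

exit_code = (status >> 8) & 0xFF   # equivalent to WEXITSTATUS(status)
term_signal = status & 0x7F        # equivalent to WTERMSIG(status); 0 => normal exit

print(exit_code, term_signal)      # 255 0
```

An exit code of 255 with no terminating signal means the work process ended itself, so the root cause is in the work-process trace files (dev_w1, dev_w12) rather than in the dispatcher.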

Tue Sep 17 17:40:29:904 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:40:49:904 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:41:09:905 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:41:29:906 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:41:49:906 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:42:09:907 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:42:29:907 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:42:49:908 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:38:09 2019, skip new snapshot

Tue Sep 17 17:43:09:909 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 11 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:43:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:43:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:43:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10865345, readCount 10865345)
UPD : 0 (peak 31, writeCount 2238, readCount 2238)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080047, readCount 1080047)
SPO : 0 (peak 2, writeCount 10969, readCount 10969)
UP2 : 0 (peak 1, writeCount 1057, readCount 1057)
DISP: 0 (peak 67, writeCount 417126, readCount 417126)
GW : 0 (peak 45, writeCount 9927611, readCount 9927611)
ICM : 0 (peak 186, writeCount 196327, readCount 196327)
LWP : 2 (peak 15, writeCount 16813, readCount 16811)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:43:09 2019
------------------------------------------------------------
Current snapshot id: 11
DB clean time (in percent of total time) : 23.49 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0|31517 |DIA |WP_RUN | | |norm|T93_U24189_M0 |HTTP_NORM| | | 0|<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |
| 1| |DIA |WP_KILL| |14 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |14 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 17:43:09 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:42:57|16 |SAPMSSY1 |norm| | | | 4246|
|HTTP_NORMAL |T93_U24189_M0 |001|SM_EXTERN_WS|10.54.36.37 |17:43:09|0 |SAPMHTTP |norm| | | | 4590|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions

Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:43:09 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 17:42:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:43:09 2019
------------------------------------------------------------
Current pipes in use: 219
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:43:09 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2490| 49| | |
| 1|DDLOG | 2490| 49| | |
| 2|BTCSCHED | 4982| 50| | |
| 3|RESTART_ALL | 996| 107| | |
| 4|ENVCHECK | 14949| 20| | |
| 5|AUTOABAP | 996| 107| | |
| 6|BGRFC_WATCHDOG | 997| 107| | |
| 7|AUTOTH | 1063| 56| | |
| 8|AUTOCCMS | 4982| 50| | |
| 9|AUTOSECURITY | 4982| 50| | |
| 10|LOAD_CALCULATION | 298649| 1| | |
| 11|SPOOLALRM | 4983| 50| | |
| 12|CALL_DELAYED | 0| 11756| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 11 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:43:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
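The same two slots (W1 and W12) recur in every warning block of this trace. A hypothetical helper for tallying which work-process slots keep triggering the "has no pid" warning; the sample lines are taken from this trace, and in practice one would read the full /usr/sap/SMP/DVEBMGS00/work/dev_disp file instead of the embedded string:

```python
import re
from collections import Counter

# Sample dev_disp lines (copied from this trace); replace with the real file contents.
sample = """\
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
"""

# Match the slot name (W1, W12, ...) in each DpHdlDeadWp warning line.
pattern = re.compile(r"wp_adm slot for (W\d+) has no pid")
counts = Counter(m.group(1) for m in pattern.finditer(sample))

print(counts.most_common())  # [('W1', 2), ('W12', 1)]
```

Seeing the same slots dominate the tally, as here, points at a restart loop for specific work processes rather than a system-wide problem.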

Tue Sep 17 17:43:29:909 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:43:49:910 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:44:09:911 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13370

Tue Sep 17 17:44:16:419 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:44:29:911 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13370 terminated

Tue Sep 17 17:44:49:912 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:45:09:913 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-14008
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-14010

Tue Sep 17 17:45:11:624 2019

*** ERROR => DpHdlDeadWp: W1 (pid 14008) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14008) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14008)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14010) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14010) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14010)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:45:29:914 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:45:49:914 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:46:09:915 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:46:29:915 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:46:49:916 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:47:09:916 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:47:29:917 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:47:49:917 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:48:09:917 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:48:29:917 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:48:49:918 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:44:09 2019, skip new snapshot

Tue Sep 17 17:49:09:919 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 12 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:49:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:49:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:49:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10866135, readCount 10866135)
UPD : 0 (peak 31, writeCount 2239, readCount 2239)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080051, readCount 1080051)
SPO : 0 (peak 2, writeCount 10982, readCount 10982)
UP2 : 0 (peak 1, writeCount 1058, readCount 1058)
DISP: 0 (peak 67, writeCount 417166, readCount 417166)
GW : 0 (peak 45, writeCount 9928155, readCount 9928155)
ICM : 0 (peak 186, writeCount 196354, readCount 196354)
LWP : 2 (peak 15, writeCount 16828, readCount 16826)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:49:09 2019
------------------------------------------------------------
Current snapshot id: 12
DB clean time (in percent of total time) : 23.50 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |15 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |15 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 17:49:09 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:48:57|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:49:09 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 17:48:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:49:09 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:49:09 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2493| 49| | |
| 1|DDLOG | 2493| 49| | |
| 2|BTCSCHED | 4988| 50| | |
| 3|RESTART_ALL | 997| 47| | |
| 4|ENVCHECK | 14967| 20| | |
| 5|AUTOABAP | 997| 47| | |
| 6|BGRFC_WATCHDOG | 998| 47| | |
| 7|AUTOTH | 1069| 56| | |
| 8|AUTOCCMS | 4988| 50| | |
| 9|AUTOSECURITY | 4988| 50| | |
| 10|LOAD_CALCULATION | 299008| 1| | |
| 11|SPOOLALRM | 4989| 50| | |
| 12|CALL_DELAYED | 0| 11396| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 12 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:49:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:49:29:919 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:49:49:920 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:50:09:920 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-15356
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-15357
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15358

Tue Sep 17 17:50:11:597 2019


*** ERROR => DpHdlDeadWp: W1 (pid 15356) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15356) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 15356)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 15357) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15357) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 15357)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:50:16:479 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:50:29:920 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15358 terminated

Tue Sep 17 17:50:49:921 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:51:09:922 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:51:29:923 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:51:49:924 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:52:09:924 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:52:29:924 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:52:49:925 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:53:09:925 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:53:29:925 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:53:49:926 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:54:09:927 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:54:29:927 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:54:49:927 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:50:09 2019, skip new snapshot

Tue Sep 17 17:55:09:928 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
********** SERVER SNAPSHOT 13 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:55:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 17:55:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 17:55:09 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10866904, readCount 10866904)
UPD : 0 (peak 31, writeCount 2240, readCount 2240)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080055, readCount 1080055)
SPO : 0 (peak 2, writeCount 10995, readCount 10995)
UP2 : 0 (peak 1, writeCount 1059, readCount 1059)
DISP: 0 (peak 67, writeCount 417211, readCount 417211)
GW : 1 (peak 45, writeCount 9928698, readCount 9928697)
ICM : 1 (peak 186, writeCount 196381, readCount 196380)
LWP : 2 (peak 15, writeCount 16843, readCount 16841)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 17:55:09 2019
------------------------------------------------------------

Current snapshot id: 13

DB clean time (in percent of total time) : 23.51 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem| Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |16 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |16 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 17:55:09 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP | Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |17:54:57|16 | SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 | SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 | SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 17:55:09 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State | Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED | SERVER|RECEIVE | 16|Tue Sep 17 17:54:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 17:55:09 2019
------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 17:55:09 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2496| 49| | |
| 1|DDLOG | 2496| 49| | |
| 2|BTCSCHED | 4994| 50| | |
| 3|RESTART_ALL | 999| 287| | |
| 4|ENVCHECK | 14985| 20| | |
| 5|AUTOABAP | 999| 287| | |
| 6|BGRFC_WATCHDOG | 1000| 287| | |
| 7|AUTOTH | 1075| 56| | |
| 8|AUTOCCMS | 4994| 50| | |
| 9|AUTOSECURITY | 4994| 50| | |
| 10|LOAD_CALCULATION | 299366| 1| | |
| 11|SPOOLALRM | 4995| 50| | |
| 12|CALL_DELAYED | 0| 11036| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 13 (Reason: Workprocess 1 died / Time: Tue Sep 17 17:55:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-17044
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:55:09:934 2019


DpWpDynCreate: created new work process W12-17045

Tue Sep 17 17:55:11:428 2019


*** ERROR => DpHdlDeadWp: W1 (pid 17044) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17044) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 17044)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 17045) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17045) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 17045)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:55:29:928 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:55:49:929 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 17:56:09:930 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 17306

Tue Sep 17 17:56:16:151 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:56:29:931 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 17306 terminated

Tue Sep 17 17:56:41:263 2019


*** ERROR => NiIRead: invalid data (0xfffd03ff/0x8800;mode=0;hdl 51;peer=178.73.215.171:30556;local=3200) [nixxi.cpp 5226]

Tue Sep 17 17:56:49:931 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:57:09:932 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:57:29:932 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:57:49:933 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:58:09:933 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:58:29:934 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:58:49:934 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:59:09:934 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:59:29:935 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 17:59:49:935 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 18:00:09:936 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-18904
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18905

Tue Sep 17 18:00:11:696 2019

*** ERROR => DpHdlDeadWp: W1 (pid 18904) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18904) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18904)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18905) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18905) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18905)
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 18:00:29:936 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 18:00:49:937 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 17:56:09 2019, skip new snapshot

Tue Sep 17 18:01:09:938 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 14 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:01:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:01:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0
Queue Statistics Tue Sep 17 18:01:09 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10867708, readCount 10867708)
UPD : 0 (peak 31, writeCount 2241, readCount 2241)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080059, readCount 1080059)
SPO : 0 (peak 2, writeCount 11008, readCount 11008)
UP2 : 0 (peak 1, writeCount 1060, readCount 1060)
DISP: 0 (peak 67, writeCount 417252, readCount 417252)
GW : 0 (peak 45, writeCount 9929226, readCount 9929226)
ICM : 0 (peak 186, writeCount 196408, readCount 196408)
LWP : 0 (peak 15, writeCount 16858, readCount 16858)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:01:09 2019
------------------------------------------------------------

Current snapshot id: 14

DB clean time (in percent of total time) : 23.52 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem| Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |18 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |18 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 18:01:09 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP | Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:00:57|3 | SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 | SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 | SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:01:09 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State | Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED | SERVER|RECEIVE | 3|Tue Sep 17 18:00:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:01:09 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:01:09 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2499| 49| | |
| 1|DDLOG | 2499| 49| | |
| 2|BTCSCHED | 5000| 50| | |
| 3|RESTART_ALL | 1000| 227| | |
| 4|ENVCHECK | 15003| 20| | |
| 5|AUTOABAP | 1000| 227| | |
| 6|BGRFC_WATCHDOG | 1001| 227| | |
| 7|AUTOTH | 1081| 56| | |
| 8|AUTOCCMS | 5000| 50| | |
| 9|AUTOSECURITY | 5000| 50| | |
| 10|LOAD_CALCULATION | 299724| 1| | |
| 11|SPOOLALRM | 5001| 50| | |
| 12|CALL_DELAYED | 0| 10676| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 14 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:01:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:01:29:938 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:01:49:938 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:02:09:939 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22279

Tue Sep 17 18:02:16:313 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:02:29:939 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22279 terminated

Tue Sep 17 18:02:49:940 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:03:09:940 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:03:29:941 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:03:49:942 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:04:09:942 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:04:29:943 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:04:49:943 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:05:09:943 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-2155
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-2156

Tue Sep 17 18:05:11:656 2019

*** ERROR => DpHdlDeadWp: W1 (pid 2155) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2155) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 2155)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 2156) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2156) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 2156)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new snapshot

Tue Sep 17 18:05:29:943 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot

Tue Sep 17 18:05:49:944 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot

Tue Sep 17 18:06:09:945 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot

Tue Sep 17 18:06:29:945 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot

Tue Sep 17 18:06:49:946 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:02:09 2019, skip new
snapshot

Tue Sep 17 18:07:09:946 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 15 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:07:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:07:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:07:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10868725, readCount 10868725)
UPD : 0 (peak 31, writeCount 2242, readCount 2242)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080063, readCount 1080063)
SPO : 0 (peak 2, writeCount 11021, readCount 11021)
UP2 : 0 (peak 1, writeCount 1061, readCount 1061)
DISP: 0 (peak 67, writeCount 417293, readCount 417293)
GW : 0 (peak 45, writeCount 9929986, readCount 9929986)
ICM : 0 (peak 186, writeCount 196435, readCount 196435)
LWP : 0 (peak 15, writeCount 16873, readCount 16873)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:07:09 2019
------------------------------------------------------------
Current snapshot id: 15
DB clean time (in percent of total time) : 23.53 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |19 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |19 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 18:07:09 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:06:57|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:07:09 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 18:06:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:07:09 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:07:09 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2502| 49| | |
| 1|DDLOG | 2502| 49| | |
| 2|BTCSCHED | 5006| 50| | |
| 3|RESTART_ALL | 1001| 167| | |
| 4|ENVCHECK | 15021| 20| | |
| 5|AUTOABAP | 1001| 167| | |
| 6|BGRFC_WATCHDOG | 1002| 167| | |
| 7|AUTOTH | 1087| 56| | |
| 8|AUTOCCMS | 5006| 50| | |
| 9|AUTOSECURITY | 5006| 50| | |
| 10|LOAD_CALCULATION | 300083| 1| | |
| 11|SPOOLALRM | 5007| 50| | |
| 12|CALL_DELAYED | 0| 10316| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 15 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:07:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:07:29:947 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:07:49:948 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:08:09:948 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16670

Tue Sep 17 18:08:16:604 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:08:29:948 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16670 terminated

Tue Sep 17 18:08:49:949 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:09:09:950 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:09:29:951 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:09:49:951 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:10:09:952 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-17855
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-17857

Tue Sep 17 18:10:11:426 2019
*** ERROR => DpHdlDeadWp: W1 (pid 17855) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17855) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 17855)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 17857) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17857) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 17857)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:10:29:952 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:10:49:952 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:11:09:953 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:11:29:953 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:11:49:954 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:12:09:954 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:12:29:954 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:12:49:956 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:08:09 2019, skip new snapshot

Tue Sep 17 18:13:09:956 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
********** SERVER SNAPSHOT 16 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:13:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:13:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:13:09 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10869549, readCount 10869549)
UPD : 0 (peak 31, writeCount 2244, readCount 2244)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080071, readCount 1080071)
SPO : 0 (peak 2, writeCount 11035, readCount 11035)
UP2 : 0 (peak 1, writeCount 1063, readCount 1063)
DISP: 0 (peak 67, writeCount 417334, readCount 417334)
GW : 0 (peak 45, writeCount 9930568, readCount 9930568)
ICM : 0 (peak 186, writeCount 196464, readCount 196464)
LWP : 2 (peak 15, writeCount 16903, readCount 16901)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:13:09 2019
------------------------------------------------------------
Current snapshot id: 16
DB clean time (in percent of total time) : 23.54 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |20 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |20 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 18:13:09 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:12:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:13:09 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 18:12:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:13:09 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:13:09 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2505| 49| | |
| 1|DDLOG | 2505| 49| | |
| 2|BTCSCHED | 5012| 50| | |
| 3|RESTART_ALL | 1002| 107| | |
| 4|ENVCHECK | 15039| 20| | |
| 5|AUTOABAP | 1002| 107| | |
| 6|BGRFC_WATCHDOG | 1003| 107| | |
| 7|AUTOTH | 1093| 56| | |
| 8|AUTOCCMS | 5012| 50| | |
| 9|AUTOSECURITY | 5012| 50| | |
| 10|LOAD_CALCULATION | 300441| 1| | |
| 11|SPOOLALRM | 5013| 50| | |
| 12|CALL_DELAYED | 0| 9956| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 16 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:13:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:13:29:957 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:13:49:958 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:14:09:959 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18921

Tue Sep 17 18:14:16:482 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:14:29:959 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 18921 terminated

Tue Sep 17 18:14:49:960 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:15:09:960 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19560
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19561

Tue Sep 17 18:15:11:666 2019
*** ERROR => DpHdlDeadWp: W1 (pid 19560) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19560) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19560)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19561) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19561) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19561)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:15:29:961 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:15:49:962 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:16:09:963 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:16:29:963 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:16:49:964 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:17:09:965 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:17:29:965 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:17:49:966 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:18:09:967 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:18:29:967 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:18:49:968 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:14:09 2019, skip new snapshot

Tue Sep 17 18:19:09:969 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 17 (Reason: Workprocess 1 died / Time: Tue Sep 17


18:19:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:19:09 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:19:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10870439, readCount 10870439)


UPD : 0 (peak 31, writeCount 2245, readCount 2245)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080075, readCount 1080075)
SPO : 0 (peak 2, writeCount 11048, readCount 11048)
UP2 : 0 (peak 1, writeCount 1064, readCount 1064)
DISP: 0 (peak 67, writeCount 417375, readCount 417375)
GW : 0 (peak 45, writeCount 9931196, readCount 9931196)
ICM : 0 (peak 186, writeCount 196491, readCount 196491)
LWP : 2 (peak 15, writeCount 16918, readCount 16916)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:19:09 2019


------------------------------------------------------------

Current snapshot id: 17


DB clean time (in percent of total time) : 23.55 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|


Time |Program |Cli|User |Action
|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|---
--|----------------------------------------|---|------------|--------------------|-
-------------------|
| 1| |DIA |WP_KILL| |21 |norm|T138_U21626_M0 |HTTP_NORM| | |
|CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| |
|
| 12| |BTC |WP_KILL| |21 |low |T105_U21576_M0 |BATCH | | |
|SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD |
|

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 18:19:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:18:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
RFC-Connection Table (1 entries) Tue Sep 17 18:19:09 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 18:18:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:19:09 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:19:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2508| 49| | |
| 1|DDLOG | 2508| 49| | |
| 2|BTCSCHED | 5018| 50| | |
| 3|RESTART_ALL | 1003| 47| | |
| 4|ENVCHECK | 15057| 20| | |
| 5|AUTOABAP | 1003| 47| | |
| 6|BGRFC_WATCHDOG | 1004| 47| | |
| 7|AUTOTH | 1099| 56| | |
| 8|AUTOCCMS | 5018| 50| | |
| 9|AUTOSECURITY | 5018| 50| | |
| 10|LOAD_CALCULATION | 300800| 1| | |
| 11|SPOOLALRM | 5019| 50| | |
| 12|CALL_DELAYED | 0| 9596| | |

Found 13 periodic tasks


********** SERVER SNAPSHOT 17 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:19:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:19:29:970 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:19:49:970 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:20:09:971 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-21197
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-21198
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 21199

Tue Sep 17 18:20:11:639 2019


*** ERROR => DpHdlDeadWp: W1 (pid 21197) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21197) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21197)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21198) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21198) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21198)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:20:16:813 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:20:29:972 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 21199 terminated

Tue Sep 17 18:20:49:973 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:21:09:973 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:21:29:974 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:21:49:974 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:22:09:975 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:22:29:976 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:22:49:976 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:23:09:977 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:23:29:978 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:23:49:978 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:24:09:979 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:24:29:979 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:24:49:980 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:20:09 2019, skip new snapshot

Tue Sep 17 18:25:09:980 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 18 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:25:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:25:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:25:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10871383, readCount 10871383)


UPD : 0 (peak 31, writeCount 2246, readCount 2246)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080079, readCount 1080079)
SPO : 0 (peak 2, writeCount 11061, readCount 11061)
UP2 : 0 (peak 1, writeCount 1065, readCount 1065)
DISP: 0 (peak 67, writeCount 417420, readCount 417420)
GW : 0 (peak 45, writeCount 9931875, readCount 9931875)
ICM : 1 (peak 186, writeCount 196520, readCount 196519)
LWP : 2 (peak 15, writeCount 16933, readCount 16931)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:25:09 2019


------------------------------------------------------------

Current snapshot id: 18


DB clean time (in percent of total time) : 23.56 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |22 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |22 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 18:25:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:24:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:25:09 2019


------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 18:24:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:25:09 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:25:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2511| 49| | |
| 1|DDLOG | 2511| 49| | |
| 2|BTCSCHED | 5024| 50| | |
| 3|RESTART_ALL | 1005| 287| | |
| 4|ENVCHECK | 15075| 20| | |
| 5|AUTOABAP | 1005| 287| | |
| 6|BGRFC_WATCHDOG | 1006| 287| | |
| 7|AUTOTH | 1105| 56| | |
| 8|AUTOCCMS | 5024| 50| | |
| 9|AUTOSECURITY | 5024| 50| | |
| 10|LOAD_CALCULATION | 301158| 1| | |
| 11|SPOOLALRM | 5025| 50| | |
| 12|CALL_DELAYED | 0| 9236| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 18 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:25:09 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-22896
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:25:09:986 2019


DpWpDynCreate: created new work process W12-22898

Tue Sep 17 18:25:11:692 2019


*** ERROR => DpHdlDeadWp: W1 (pid 22896) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22896) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 22896)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 22898) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22898) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 22898)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:25:29:981 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:25:49:981 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:26:09:982 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23176

Tue Sep 17 18:26:16:601 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:26:29:982 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23176 terminated

Tue Sep 17 18:26:49:982 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:27:09:983 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:27:29:983 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:27:49:984 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:28:09:985 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:28:29:985 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:28:49:986 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:29:09:987 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:29:29:987 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:29:49:987 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:30:09:988 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-24971
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-24972

Tue Sep 17 18:30:11:581 2019

*** ERROR => DpHdlDeadWp: W1 (pid 24971) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24971) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 24971)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 24972) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24972) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 24972)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:30:29:989 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot

Tue Sep 17 18:30:49:990 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:26:09 2019, skip new snapshot
Tue Sep 17 18:31:09:990 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 19 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:31:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:31:09 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:31:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10872184, readCount 10872184)


UPD : 0 (peak 31, writeCount 2247, readCount 2247)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080083, readCount 1080083)
SPO : 0 (peak 2, writeCount 11074, readCount 11074)
UP2 : 0 (peak 1, writeCount 1066, readCount 1066)
DISP: 0 (peak 67, writeCount 417460, readCount 417460)
GW : 0 (peak 45, writeCount 9932407, readCount 9932407)
ICM : 1 (peak 186, writeCount 196547, readCount 196546)
LWP : 0 (peak 15, writeCount 16948, readCount 16948)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:31:09 2019


------------------------------------------------------------

Current snapshot id: 19


DB clean time (in percent of total time) : 23.57 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |24 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |24 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 18:31:09 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:30:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:31:09 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 18:30:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:31:09 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:31:09 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2514| 49| | |
| 1|DDLOG | 2514| 49| | |
| 2|BTCSCHED | 5030| 50| | |
| 3|RESTART_ALL | 1006| 227| | |
| 4|ENVCHECK | 15093| 20| | |
| 5|AUTOABAP | 1006| 227| | |
| 6|BGRFC_WATCHDOG | 1007| 227| | |
| 7|AUTOTH | 1111| 56| | |
| 8|AUTOCCMS | 5030| 50| | |
| 9|AUTOSECURITY | 5030| 50| | |
| 10|LOAD_CALCULATION | 301516| 1| | |
| 11|SPOOLALRM | 5031| 50| | |
| 12|CALL_DELAYED | 0| 8876| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 19 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:31:09 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:31:29:991 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:31:49:992 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:32:09:993 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25531

Tue Sep 17 18:32:16:468 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot

Tue Sep 17 18:32:29:993 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25531 terminated

Tue Sep 17 18:32:49:993 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
Tue Sep 17 18:33:09:993 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot

Tue Sep 17 18:33:29:994 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new
snapshot

Tue Sep 17 18:33:49:994 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:34:09:995 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:34:29:995 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:34:49:995 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:35:09:995 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-26690
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-26691

Tue Sep 17 18:35:11:703 2019


*** ERROR => DpHdlDeadWp: W1 (pid 26690) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26690) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 26690)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 26691) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26691) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 26691)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:35:29:996 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:35:49:996 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:36:09:997 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:36:29:998 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:36:49:998 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:32:09 2019, skip new snapshot

Tue Sep 17 18:37:09:999 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 20 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:37:09 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:37:09 2019

Tue Sep 17 18:37:10:000 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:37:09 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10873006, readCount 10873006)
UPD : 0 (peak 31, writeCount 2248, readCount 2248)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080087, readCount 1080087)
SPO : 0 (peak 2, writeCount 11087, readCount 11087)
UP2 : 0 (peak 1, writeCount 1067, readCount 1067)
DISP: 0 (peak 67, writeCount 417501, readCount 417501)
GW : 0 (peak 45, writeCount 9932967, readCount 9932967)
ICM : 0 (peak 186, writeCount 196574, readCount 196574)
LWP : 0 (peak 15, writeCount 16963, readCount 16963)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:37:10 2019


------------------------------------------------------------

Current snapshot id: 20


DB clean time (in percent of total time) : 23.58 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |25 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |25 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 18:37:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:36:57|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:37:10 2019


------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 18:36:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:37:10 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:37:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2517| 48| | |
| 1|DDLOG | 2517| 48| | |
| 2|BTCSCHED | 5036| 49| | |
| 3|RESTART_ALL | 1007| 166| | |
| 4|ENVCHECK | 15111| 19| | |
| 5|AUTOABAP | 1007| 166| | |
| 6|BGRFC_WATCHDOG | 1008| 166| | |
| 7|AUTOTH | 1117| 55| | |
| 8|AUTOCCMS | 5036| 49| | |
| 9|AUTOSECURITY | 5036| 49| | |
| 10|LOAD_CALCULATION | 301875| 0| | |
| 11|SPOOLALRM | 5037| 49| | |
| 12|CALL_DELAYED | 0| 8515| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 20 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:37:30:001 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:37:50:001 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:38:10:001 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27698

Tue Sep 17 18:38:17:154 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:38:30:010 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27698 terminated

Tue Sep 17 18:38:50:011 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:39:10:011 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
Tue Sep 17 18:39:30:011 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:39:50:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:40:10:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
DpWpDynCreate: created new work process W1-28489
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
DpWpDynCreate: created new work process W12-28490

Tue Sep 17 18:40:11:727 2019


*** ERROR => DpHdlDeadWp: W1 (pid 28489) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28489) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28489)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28490) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28490) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28490)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:40:30:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:40:50:013 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:41:10:013 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:41:30:014 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:41:50:014 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:42:10:015 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:42:30:015 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:42:50:016 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:38:09 2019, skip new snapshot

Tue Sep 17 18:43:10:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 21 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:43:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:43:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10873794, readCount 10873794)
UPD : 0 (peak 31, writeCount 2250, readCount 2250)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080095, readCount 1080095)
SPO : 0 (peak 2, writeCount 11101, readCount 11101)
UP2 : 0 (peak 1, writeCount 1069, readCount 1069)
DISP: 0 (peak 67, writeCount 417542, readCount 417542)
GW : 0 (peak 45, writeCount 9933501, readCount 9933501)
ICM : 0 (peak 186, writeCount 196603, readCount 196603)
LWP : 2 (peak 15, writeCount 16993, readCount 16991)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:


Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:43:10 2019


------------------------------------------------------------

Current snapshot id: 21


DB clean time (in percent of total time) : 23.58 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |26 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 3|19909 |DIA |WP_RUN | | |norm|T5_U28187_M0 |HTTP_NORM| | |1 |<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |26 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 18:43:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T5_U28187_M0 |001|SM_EXTERN_WS|10.54.36.37 |18:43:09|3 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:42:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:43:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 18:42:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:43:10 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:43:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2520| 48| | |
| 1|DDLOG | 2520| 48| | |
| 2|BTCSCHED | 5042| 49| | |
| 3|RESTART_ALL | 1008| 106| | |
| 4|ENVCHECK | 15129| 20| | |
| 5|AUTOABAP | 1008| 106| | |
| 6|BGRFC_WATCHDOG | 1009| 106| | |
| 7|AUTOTH | 1123| 55| | |
| 8|AUTOCCMS | 5042| 49| | |
| 9|AUTOSECURITY | 5042| 49| | |
| 10|LOAD_CALCULATION | 302234| 0| | |
| 11|SPOOLALRM | 5043| 49| | |
| 12|CALL_DELAYED | 0| 8155| | |

Found 13 periodic tasks


********** SERVER SNAPSHOT 21 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:43:30:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:43:50:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:44:10:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 29574

Tue Sep 17 18:44:16:692 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:44:30:022 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 29574 terminated

Tue Sep 17 18:44:50:022 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:45:10:023 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-30209
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-30210

Tue Sep 17 18:45:11:461 2019


*** ERROR => DpHdlDeadWp: W1 (pid 30209) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30209) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 30209)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 30210) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30210) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 30210)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:45:30:023 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:45:50:023 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:46:10:024 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:46:30:024 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:46:50:024 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:47:10:024 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:47:30:025 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:47:50:026 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:48:10:027 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:48:30:027 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:48:50:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:44:10 2019, skip new snapshot

Tue Sep 17 18:49:10:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 22 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:49:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:49:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10874597, readCount 10874597)
UPD : 0 (peak 31, writeCount 2251, readCount 2251)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080099, readCount 1080099)
SPO : 0 (peak 2, writeCount 11114, readCount 11114)
UP2 : 0 (peak 1, writeCount 1070, readCount 1070)
DISP: 0 (peak 67, writeCount 417583, readCount 417583)
GW : 1 (peak 45, writeCount 9934041, readCount 9934040)
ICM : 1 (peak 186, writeCount 196630, readCount 196629)
LWP : 2 (peak 15, writeCount 17008, readCount 17006)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:49:10 2019
------------------------------------------------------------
Current snapshot id: 22
DB clean time (in percent of total time) : 23.59 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |27 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |27 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 18:49:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:48:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:49:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 18:48:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:49:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:49:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2523| 48| | |
| 1|DDLOG | 2523| 48| | |
| 2|BTCSCHED | 5048| 49| | |
| 3|RESTART_ALL | 1009| 46| | |
| 4|ENVCHECK | 15147| 20| | |
| 5|AUTOABAP | 1009| 46| | |
| 6|BGRFC_WATCHDOG | 1010| 46| | |
| 7|AUTOTH | 1129| 56| | |
| 8|AUTOCCMS | 5048| 49| | |
| 9|AUTOSECURITY | 5048| 49| | |
| 10|LOAD_CALCULATION | 302592| 0| | |
| 11|SPOOLALRM | 5049| 49| | |
| 12|CALL_DELAYED | 0| 7795| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 22 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:49:30:029 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:49:50:030 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:50:10:030 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-31707
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-31708
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 31709

Tue Sep 17 18:50:11:765 2019
*** ERROR => DpHdlDeadWp: W1 (pid 31707) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31707) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31707)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 31708) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31708) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 31708)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:50:16:974 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:50:30:030 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 31709 terminated

Tue Sep 17 18:50:50:031 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:51:10:031 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:51:30:032 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:51:50:032 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:52:10:033 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:52:30:033 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:52:50:033 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:53:10:034 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:53:30:034 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:53:50:034 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:54:10:035 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:54:30:036 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:54:50:036 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:50:10 2019, skip new snapshot

Tue Sep 17 18:55:10:037 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 23 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:55:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 18:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 18:55:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10875377, readCount 10875377)
UPD : 0 (peak 31, writeCount 2252, readCount 2252)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080103, readCount 1080103)
SPO : 0 (peak 2, writeCount 11127, readCount 11127)
UP2 : 0 (peak 1, writeCount 1071, readCount 1071)
DISP: 0 (peak 67, writeCount 417628, readCount 417628)
GW : 0 (peak 45, writeCount 9934584, readCount 9934584)
ICM : 0 (peak 186, writeCount 196657, readCount 196657)
LWP : 2 (peak 15, writeCount 17023, readCount 17021)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 18:55:10 2019
------------------------------------------------------------
Current snapshot id: 23
DB clean time (in percent of total time) : 23.60 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |28 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 3|19909 |DIA |WP_RUN | | |norm|T32_U28936_M0 |HTTP_NORM| | 1| |<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |28 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 18:55:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T32_U28936_M0 |001|SM_EXTERN_WS|10.54.36.37 |18:55:09|3 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |18:54:57|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 18:55:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 18:54:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 18:55:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 18:55:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2526| 48| | |
| 1|DDLOG | 2526| 48| | |
| 2|BTCSCHED | 5054| 49| | |
| 3|RESTART_ALL | 1011| 286| | |
| 4|ENVCHECK | 15165| 20| | |
| 5|AUTOABAP | 1011| 286| | |
| 6|BGRFC_WATCHDOG | 1012| 286| | |
| 7|AUTOTH | 1135| 56| | |
| 8|AUTOCCMS | 5054| 49| | |
| 9|AUTOSECURITY | 5054| 49| | |
| 10|LOAD_CALCULATION | 302951| 0| | |
| 11|SPOOLALRM | 5055| 49| | |
| 12|CALL_DELAYED | 0| 7435| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 23 (Reason: Workprocess 1 died / Time: Tue Sep 17 18:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-1039
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:55:10:043 2019
DpWpDynCreate: created new work process W12-1040

Tue Sep 17 18:55:11:815 2019
*** ERROR => DpHdlDeadWp: W1 (pid 1039) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1039) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1039)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1040) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1040) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1040)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:55:30:038 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:55:50:038 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 18:56:10:038 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1411

Tue Sep 17 18:56:17:224 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:56:30:039 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1411 terminated

Tue Sep 17 18:56:50:040 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:57:10:040 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:57:30:041 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:57:50:041 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:58:10:041 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:58:30:042 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:58:50:042 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:59:10:043 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:59:30:044 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 18:59:50:044 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 19:00:10:045 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3186
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3187

Tue Sep 17 19:00:11:816 2019
*** ERROR => DpHdlDeadWp: W1 (pid 3186) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3186) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3186)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3187) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3187) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 3187)
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 19:00:30:045 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 19:00:50:045 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 18:56:10 2019, skip new snapshot

Tue Sep 17 19:01:10:046 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 24 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:01:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:01:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10876207, readCount 10876207)
UPD : 0 (peak 31, writeCount 2253, readCount 2253)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080107, readCount 1080107)
SPO : 0 (peak 2, writeCount 11140, readCount 11140)
UP2 : 0 (peak 1, writeCount 1072, readCount 1072)
DISP: 0 (peak 67, writeCount 417669, readCount 417669)
GW : 0 (peak 45, writeCount 9935128, readCount 9935128)
ICM : 0 (peak 186, writeCount 196684, readCount 196684)
LWP : 0 (peak 15, writeCount 17038, readCount 17038)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:01:10 2019
------------------------------------------------------------
Current snapshot id: 24
DB clean time (in percent of total time) : 23.61 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |30 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |30 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 19:01:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |19:00:57|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:01:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 19:00:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:01:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:01:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2529| 48| | |
| 1|DDLOG | 2529| 48| | |
| 2|BTCSCHED | 5060| 49| | |
| 3|RESTART_ALL | 1012| 226| | |
| 4|ENVCHECK | 15183| 20| | |
| 5|AUTOABAP | 1012| 226| | |
| 6|BGRFC_WATCHDOG | 1013| 226| | |
| 7|AUTOTH | 1141| 56| | |
| 8|AUTOCCMS | 5060| 49| | |
| 9|AUTOSECURITY | 5060| 49| | |
| 10|LOAD_CALCULATION | 303310| 0| | |
| 11|SPOOLALRM | 5061| 49| | |
| 12|CALL_DELAYED | 0| 7075| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 24 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:01:30:047 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:01:50:047 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:02:10:047 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 3639

Tue Sep 17 19:02:17:519 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:02:30:047 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 3639 terminated

Tue Sep 17 19:02:50:048 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:03:10:048 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:03:30:048 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:03:50:048 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:04:10:049 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:04:30:051 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:04:50:051 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:05:10:052 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-7477
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-7478

Tue Sep 17 19:05:11:760 2019

*** ERROR => DpHdlDeadWp: W1 (pid 7477) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7477) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7477)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7478) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7478) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7478)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:05:30:053 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:05:50:054 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:06:10:054 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:06:30:054 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:06:50:054 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:02:10 2019, skip new snapshot

Tue Sep 17 19:07:10:055 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 25 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:07:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:07:10 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:07:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10877225, readCount 10877225)


UPD : 0 (peak 31, writeCount 2254, readCount 2254)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080111, readCount 1080111)
SPO : 0 (peak 2, writeCount 11153, readCount 11153)
UP2 : 0 (peak 1, writeCount 1073, readCount 1073)
DISP: 0 (peak 67, writeCount 417710, readCount 417710)
GW : 0 (peak 45, writeCount 9935888, readCount 9935888)
ICM : 0 (peak 186, writeCount 196711, readCount 196711)
LWP : 0 (peak 15, writeCount 17053, readCount 17053)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:


Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:07:10 2019


------------------------------------------------------------

Current snapshot id: 25


DB clean time (in percent of total time) : 23.62 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |31 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |31 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 19:07:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |19:06:57|6  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:07:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  6|Tue Sep 17 19:06:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:07:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:07:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2532|        48|                    |          |
|       1|DDLOG               |      2532|        48|                    |          |
|       2|BTCSCHED            |      5066|        49|                    |          |
|       3|RESTART_ALL         |      1013|       166|                    |          |
|       4|ENVCHECK            |     15201|        20|                    |          |
|       5|AUTOABAP            |      1013|       166|                    |          |
|       6|BGRFC_WATCHDOG      |      1014|       166|                    |          |
|       7|AUTOTH              |      1147|        56|                    |          |
|       8|AUTOCCMS            |      5066|        49|                    |          |
|       9|AUTOSECURITY        |      5066|        49|                    |          |
|      10|LOAD_CALCULATION    |    303669|         0|                    |          |
|      11|SPOOLALRM           |      5067|        49|                    |          |
|      12|CALL_DELAYED        |         0|      6715|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 25 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
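[Editor's note] Between snapshots, the dispatcher reports the same two slots (W1 and W12) as dead on every 20-second check. A minimal, hypothetical Python sketch (not an SAP tool; `dead_slot_counts` is an invented helper) for tallying those warnings per slot when reviewing such a trace:

```python
import re
from collections import Counter

# Warning format copied verbatim from the dev_disp lines in this trace.
DEAD_SLOT = re.compile(r"DpHdlDeadWp: wp_adm slot for (W\d+) has no pid")

def dead_slot_counts(log_text):
    """Count how often each work-process slot is reported without a pid."""
    return Counter(m.group(1) for m in DEAD_SLOT.finditer(log_text))

sample = (
    "*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]\n"
    "*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]\n"
    "*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]\n"
)
print(dead_slot_counts(sample))  # Counter({'W1': 2, 'W12': 1})
```

A count heavily skewed toward a fixed pair of slots, as here, points at those specific work processes failing to start rather than a system-wide problem.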


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Tue Sep 17 19:07:30:055 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:07:50:056 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:08:10:056 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19526

Tue Sep 17 19:08:17:094 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:08:30:057 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19526 terminated

Tue Sep 17 19:08:50:057 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:09:10:057 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:09:30:057 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:09:50:058 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:10:10:059 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-25989
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-25990

Tue Sep 17 19:10:11:796 2019

*** ERROR => DpHdlDeadWp: W1 (pid 25989) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25989) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25989)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25990) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25990) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25990)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:10:30:059 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:10:50:059 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:11:10:060 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:11:30:061 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:11:50:061 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:12:10:061 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:12:30:061 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:12:50:062 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:08:10 2019, skip new snapshot

Tue Sep 17 19:13:10:063 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 26 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:13:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:13:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10878068, readCount 10878068)


UPD : 0 (peak 31, writeCount 2256, readCount 2256)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080119, readCount 1080119)
SPO : 0 (peak 2, writeCount 11167, readCount 11167)
UP2 : 0 (peak 1, writeCount 1075, readCount 1075)
DISP: 0 (peak 67, writeCount 417750, readCount 417750)
GW : 0 (peak 45, writeCount 9936474, readCount 9936474)
ICM : 0 (peak 186, writeCount 196740, readCount 196740)
LWP : 2 (peak 15, writeCount 17083, readCount 17081)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:13:10 2019
------------------------------------------------------------

Current snapshot id: 26


DB clean time (in percent of total time) : 23.63 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  0|31517   |DIA |WP_RUN |     |   |norm|T39_U30200_M0   |HTTP_NORM|      |   |    1|<HANDLE PLUGIN>                         |001|SM_EXTERN_WS|                    |                    |
|  1|        |DIA |WP_KILL|     |32 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |32 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 19:13:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T39_U30200_M0   |001|SM_EXTERN_WS|10.54.36.37         |19:13:09|0  |SAPMHTTP                                |norm|     |                                                  |          |      4590|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |19:12:57|5  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:13:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  5|Tue Sep 17 19:12:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:13:10 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:13:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2535|        48|                    |          |
|       1|DDLOG               |      2535|        48|                    |          |
|       2|BTCSCHED            |      5072|        49|                    |          |
|       3|RESTART_ALL         |      1014|       106|                    |          |
|       4|ENVCHECK            |     15219|        20|                    |          |
|       5|AUTOABAP            |      1014|       106|                    |          |
|       6|BGRFC_WATCHDOG      |      1015|       106|                    |          |
|       7|AUTOTH              |      1153|        56|                    |          |
|       8|AUTOCCMS            |      5072|        49|                    |          |
|       9|AUTOSECURITY        |      5072|        49|                    |          |
|      10|LOAD_CALCULATION    |    304028|         0|                    |          |
|      11|SPOOLALRM           |      5073|        49|                    |          |
|      12|CALL_DELAYED        |         0|      6355|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 26 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:13:30:064 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:13:50:065 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:14:10:066 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 3248

Tue Sep 17 19:14:17:563 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:14:30:066 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 3248 terminated

Tue Sep 17 19:14:50:067 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:15:10:067 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-4036
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-4037

Tue Sep 17 19:15:11:777 2019

*** ERROR => DpHdlDeadWp: W1 (pid 4036) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4036) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 4036)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 4037) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4037) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 4037)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:15:30:068 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:15:50:068 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:16:10:069 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:16:30:070 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:16:50:071 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:17:10:072 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:17:30:072 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:17:50:072 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:18:10:073 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:18:30:073 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:18:50:074 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:14:10 2019, skip new snapshot

Tue Sep 17 19:19:10:074 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 27 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:19:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:19:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10878905, readCount 10878905)
UPD : 0 (peak 31, writeCount 2257, readCount 2257)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080123, readCount 1080123)
SPO : 0 (peak 2, writeCount 11180, readCount 11180)
UP2 : 0 (peak 1, writeCount 1076, readCount 1076)
DISP: 0 (peak 67, writeCount 417791, readCount 417791)
GW  : 1 (peak 45, writeCount 9937054, readCount 9937053)
ICM : 0 (peak 186, writeCount 196767, readCount 196767)
LWP : 2 (peak 15, writeCount 17098, readCount 17096)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:19:10 2019
------------------------------------------------------------
Current snapshot id: 27
DB clean time (in percent of total time) : 23.64 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |33 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |33 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 19:19:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |19:18:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:19:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 19:18:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:19:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:19:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2538| 48| | |
| 1|DDLOG | 2538| 48| | |
| 2|BTCSCHED | 5078| 49| | |
| 3|RESTART_ALL | 1015| 46| | |
| 4|ENVCHECK | 15237| 20| | |
| 5|AUTOABAP | 1015| 46| | |
| 6|BGRFC_WATCHDOG | 1016| 46| | |
| 7|AUTOTH | 1159| 56| | |
| 8|AUTOCCMS | 5078| 49| | |
| 9|AUTOSECURITY | 5078| 49| | |
| 10|LOAD_CALCULATION | 304387| 0| | |
| 11|SPOOLALRM | 5079| 49| | |
| 12|CALL_DELAYED | 0| 5995| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 27 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:19:30:075 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:19:50:076 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:20:10:076 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-5746
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-5747
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 5748

Tue Sep 17 19:20:11:811 2019
*** ERROR => DpHdlDeadWp: W1 (pid 5746) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5746) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5746)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5747) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5747) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5747)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:20:18:093 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:20:30:077 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 5748 terminated

Tue Sep 17 19:20:50:077 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:21:10:077 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:21:30:078 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:21:50:078 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:22:10:079 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:22:30:079 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:22:50:080 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:23:10:081 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:23:30:081 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:23:50:082 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:24:10:082 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:24:30:082 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:24:50:083 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:20:10 2019, skip new snapshot

Tue Sep 17 19:25:10:083 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 28 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:25:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:25:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10879739, readCount 10879739)
UPD : 0 (peak 31, writeCount 2258, readCount 2258)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080127, readCount 1080127)
SPO : 0 (peak 2, writeCount 11193, readCount 11193)
UP2 : 0 (peak 1, writeCount 1077, readCount 1077)
DISP: 0 (peak 67, writeCount 417836, readCount 417836)
GW  : 0 (peak 45, writeCount 9937611, readCount 9937611)
ICM : 0 (peak 186, writeCount 196794, readCount 196794)
LWP : 2 (peak 15, writeCount 17113, readCount 17111)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:25:10 2019
------------------------------------------------------------
Current snapshot id: 28
DB clean time (in percent of total time) : 23.65 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |34 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |34 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 19:25:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |19:24:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:25:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 19:24:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:25:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:25:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2541| 48| | |
| 1|DDLOG | 2541| 48| | |
| 2|BTCSCHED | 5084| 49| | |
| 3|RESTART_ALL | 1017| 286| | |
| 4|ENVCHECK | 15255| 20| | |
| 5|AUTOABAP | 1017| 286| | |
| 6|BGRFC_WATCHDOG | 1018| 286| | |
| 7|AUTOTH | 1165| 56| | |
| 8|AUTOCCMS | 5084| 49| | |
| 9|AUTOSECURITY | 5084| 49| | |
| 10|LOAD_CALCULATION | 304745| 0| | |
| 11|SPOOLALRM | 5085| 49| | |
| 12|CALL_DELAYED | 0| 5635| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 28 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-7518
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:25:10:089 2019
DpWpDynCreate: created new work process W12-7519

Tue Sep 17 19:25:11:820 2019
*** ERROR => DpHdlDeadWp: W1 (pid 7518) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7518) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7518)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7519) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7519) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7519)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:25:30:083 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:25:50:084 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:26:10:084 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7838

Tue Sep 17 19:26:17:591 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:26:30:085 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7838 terminated

Tue Sep 17 19:26:50:086 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:27:10:086 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:27:30:087 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:27:50:087 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:28:10:088 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:28:30:089 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:28:50:090 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:29:10:091 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:29:30:092 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:29:50:093 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:30:10:093 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-9481
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-9482

Tue Sep 17 19:30:11:826 2019
*** ERROR => DpHdlDeadWp: W1 (pid 9481) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9481) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 9481)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 9482) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9482) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 9482)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:30:30:094 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:30:50:095 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:26:10 2019, skip new snapshot

Tue Sep 17 19:31:10:095 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 29 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:31:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:31:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10880544, readCount 10880544)
UPD : 0 (peak 31, writeCount 2259, readCount 2259)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080131, readCount 1080131)
SPO : 0 (peak 2, writeCount 11206, readCount 11206)
UP2 : 0 (peak 1, writeCount 1078, readCount 1078)
DISP: 0 (peak 67, writeCount 417877, readCount 417877)
GW  : 0 (peak 45, writeCount 9938155, readCount 9938155)
ICM : 0 (peak 186, writeCount 196823, readCount 196823)
LWP : 0 (peak 15, writeCount 17128, readCount 17128)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:31:10 2019
------------------------------------------------------------
Current snapshot id: 29
DB clean time (in percent of total time) : 23.66 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |36 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |36 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 19:31:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |19:30:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:31:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 19:30:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:31:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:31:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2544| 48| | |
| 1|DDLOG | 2544| 48| | |
| 2|BTCSCHED | 5090| 49| | |
| 3|RESTART_ALL | 1018| 226| | |
| 4|ENVCHECK | 15273| 20| | |
| 5|AUTOABAP | 1018| 226| | |
| 6|BGRFC_WATCHDOG | 1019| 226| | |
| 7|AUTOTH | 1171| 56| | |
| 8|AUTOCCMS | 5090| 49| | |
| 9|AUTOSECURITY | 5090| 49| | |
| 10|LOAD_CALCULATION | 305104| 0| | |
| 11|SPOOLALRM | 5091| 49| | |
| 12|CALL_DELAYED | 0| 5275| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 29 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:31:10 2019) - end **********
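Snapshots like the one above recur every few minutes because the W1 and W12 slots keep dying and being recreated. A quick way to quantify such a restart loop offline is to tally the DpHdlDeadWp "died" lines per work process slot; this is an illustrative sketch (the `count_wp_deaths` helper is hypothetical, not an SAP tool; the sample lines are copied from this trace):

```python
import re
from collections import Counter

def count_wp_deaths(trace_text: str) -> Counter:
    """Tally how often each work process slot died in a dev_disp trace."""
    return Counter(re.findall(r"DpHdlDeadWp: (W\d+) \(pid \d+\) died", trace_text))

sample = (
    "*** ERROR => DpHdlDeadWp: W1 (pid 11296) died (severity=0, status=65280)\n"
    "*** ERROR => DpHdlDeadWp: W12 (pid 11297) died (severity=0, status=65280)\n"
)
print(count_wp_deaths(sample))  # Counter({'W1': 1, 'W12': 1})
```

Run against the full dev_disp file, this makes it obvious that only W1 and W12 are cycling while the other 14 work processes stay up.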

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Tue Sep 17 19:31:30:097 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:31:50:097 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:32:10:098 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 10010

Tue Sep 17 19:32:17:825 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:32:30:099 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 10010 terminated

Tue Sep 17 19:32:50:099 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:33:10:099 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:33:30:101 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:33:50:101 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:34:10:102 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:34:30:103 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:34:50:103 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:35:10:104 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-11296
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-11297

Tue Sep 17 19:35:11:681 2019
*** ERROR => DpHdlDeadWp: W1 (pid 11296) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11296) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 11296)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 11297) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11297) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 11297)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
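The status=65280 reported for the dead work processes is a raw wait() status: 65280 is 0xFF00, i.e. exit code 255 in the high byte and no terminating signal in the low byte, which is exactly what the accompanying "exited with exit code 255" lines say. A minimal decoding sketch using the standard POSIX wait-status macros (Unix-only `os` helpers):

```python
import os

status = 65280                 # raw wait() status from the trace (0xFF00)
assert os.WIFEXITED(status)    # low byte is 0: normal exit, not killed by a signal
print(os.WEXITSTATUS(status))  # high byte holds the exit code -> 255
```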
Tue Sep 17 19:35:30:105 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:35:50:105 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:36:10:106 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:36:30:107 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:36:50:108 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:32:10 2019, skip new snapshot

Tue Sep 17 19:37:10:108 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 30 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:37:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:37:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10881384, readCount 10881384)
UPD : 0 (peak 31, writeCount 2260, readCount 2260)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080135, readCount 1080135)
SPO : 0 (peak 2, writeCount 11219, readCount 11219)
UP2 : 0 (peak 1, writeCount 1079, readCount 1079)
DISP: 0 (peak 67, writeCount 417918, readCount 417918)
GW : 0 (peak 45, writeCount 9938723, readCount 9938723)
ICM : 0 (peak 186, writeCount 196850, readCount 196850)
LWP : 0 (peak 15, writeCount 17143, readCount 17143)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:37:10 2019
------------------------------------------------------------
Current snapshot id: 30
DB clean time (in percent of total time) : 23.67 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |37 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |37 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 19:37:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |19:36:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:37:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 19:36:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)
MPI Info Tue Sep 17 19:37:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:37:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2547| 48| | |
| 1|DDLOG | 2547| 48| | |
| 2|BTCSCHED | 5096| 49| | |
| 3|RESTART_ALL | 1019| 166| | |
| 4|ENVCHECK | 15291| 20| | |
| 5|AUTOABAP | 1019| 166| | |
| 6|BGRFC_WATCHDOG | 1020| 166| | |
| 7|AUTOTH | 1177| 56| | |
| 8|AUTOCCMS | 5096| 49| | |
| 9|AUTOSECURITY | 5096| 49| | |
| 10|LOAD_CALCULATION | 305462| 0| | |
| 11|SPOOLALRM | 5097| 49| | |
| 12|CALL_DELAYED | 0| 4915| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 30 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:37:30:108 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:37:50:109 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:38:10:109 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 12361

Tue Sep 17 19:38:17:753 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:38:30:110 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 12361 terminated

Tue Sep 17 19:38:50:111 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:39:10:111 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:39:30:112 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:39:50:113 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:40:10:114 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-13227
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-13228

Tue Sep 17 19:40:11:849 2019
*** ERROR => DpHdlDeadWp: W1 (pid 13227) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13227) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13227)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13228) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13228) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13228)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:40:30:115 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:40:50:115 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:41:10:115 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:41:30:116 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:41:50:117 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:42:10:118 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:42:30:119 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:42:50:120 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:38:10 2019, skip new snapshot

Tue Sep 17 19:43:10:121 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 31 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:43:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:43:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10882214, readCount 10882214)
UPD : 0 (peak 31, writeCount 2262, readCount 2262)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080143, readCount 1080143)
SPO : 0 (peak 2, writeCount 11233, readCount 11233)
UP2 : 0 (peak 1, writeCount 1081, readCount 1081)
DISP: 0 (peak 67, writeCount 417959, readCount 417959)
GW : 0 (peak 45, writeCount 9939297, readCount 9939297)
ICM : 1 (peak 186, writeCount 196879, readCount 196878)
LWP : 2 (peak 15, writeCount 17173, readCount 17171)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:43:10 2019
------------------------------------------------------------
Current snapshot id: 31
DB clean time (in percent of total time) : 23.68 %
Number of preemptions : 78
|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |38 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |38 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 19:43:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |19:42:57|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:43:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 19:42:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:43:10 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:43:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2550| 48| | |
| 1|DDLOG | 2550| 48| | |
| 2|BTCSCHED | 5102| 49| | |
| 3|RESTART_ALL | 1020| 106| | |
| 4|ENVCHECK | 15309| 20| | |
| 5|AUTOABAP | 1020| 106| | |
| 6|BGRFC_WATCHDOG | 1021| 106| | |
| 7|AUTOTH | 1183| 56| | |
| 8|AUTOCCMS | 5102| 49| | |
| 9|AUTOSECURITY | 5102| 49| | |
| 10|LOAD_CALCULATION | 305821| 1| | |
| 11|SPOOLALRM | 5103| 49| | |
| 12|CALL_DELAYED | 0| 4555| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 31 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:43:30:121 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:43:50:121 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Tue Sep 17 19:44:10:122 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 14217

Tue Sep 17 19:44:17:656 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:44:30:123 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 14217 terminated

Tue Sep 17 19:44:50:123 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:45:10:124 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-14848
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-14849

Tue Sep 17 19:45:11:862 2019
*** ERROR => DpHdlDeadWp: W1 (pid 14848) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14848) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14848)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14849) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14849) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14849)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:45:30:124 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:45:50:125 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:46:10:126 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:46:30:126 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:46:50:127 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:47:10:127 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:47:30:127 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:47:50:128 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:48:10:129 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:48:30:129 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:48:50:130 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:44:10 2019, skip new snapshot

Tue Sep 17 19:49:10:130 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 32 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:49:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:49:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10883044, readCount 10883044)
UPD : 0 (peak 31, writeCount 2263, readCount 2263)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080147, readCount 1080147)
SPO : 0 (peak 2, writeCount 11246, readCount 11246)
UP2 : 0 (peak 1, writeCount 1082, readCount 1082)
DISP: 0 (peak 67, writeCount 418000, readCount 418000)
GW  : 0 (peak 45, writeCount 9939859, readCount 9939859)
ICM : 0 (peak 186, writeCount 196906, readCount 196906)
LWP : 2 (peak 15, writeCount 17188, readCount 17186)
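Annotation: in each queue line the leading number is the current queue depth, and it equals writeCount minus readCount — e.g. LWP shows 2 because 17188 − 17186 = 2, while every drained queue shows 0. A quick check of that invariant over a few of the figures above (values copied from this snapshot):

```python
# (depth, writeCount, readCount) per queue, as printed in the snapshot
queues = {
    "DIA": (0, 10883044, 10883044),
    "GW":  (0, 9939859, 9939859),
    "LWP": (2, 17188, 17186),
}

for name, (depth, written, read) in queues.items():
    # depth is the number of written-but-not-yet-read elements
    assert depth == written - read, name

print("all queue depths consistent")  # → all queue depths consistent
```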

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:49:10 2019
------------------------------------------------------------
Current snapshot id: 32
DB clean time (in percent of total time) : 23.68 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |39 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |39 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 19:49:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |19:48:57|16 |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
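Annotation: the 20 MB total is consistent with the per-session ES Mem(KB) column of the session table above — 4246 + 12439 + 4590 = 21275 KB, truncated to whole megabytes — and the 12 MB peak with the T105_U21576_M0 batch session. A sketch of that arithmetic (values copied from this snapshot):

```python
# ES Mem(KB) per session, as printed in the Session Table
es_mem_kb = {
    "T54_U9861_M0": 4246,
    "T105_U21576_M0": 12439,
    "T138_U21626_M0": 4590,
}

total_mb = sum(es_mem_kb.values()) // 1024       # 21275 KB -> 20 MB (truncated)
top_session = max(es_mem_kb, key=es_mem_kb.get)  # session with largest allocation

print(total_mb, top_session, es_mem_kb[top_session] // 1024)
# → 20 T105_U21576_M0 12
```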

RFC-Connection Table (1 entries) Tue Sep 17 19:49:10 2019
------------------------------------------------------------
|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   | 16|Tue Sep 17 19:48:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:49:10 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 19:49:10 2019
------------------------------------------------------------
|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2553|        48|                    |          |
|       1|DDLOG               |      2553|        48|                    |          |
|       2|BTCSCHED            |      5108|        49|                    |          |
|       3|RESTART_ALL         |      1021|        46|                    |          |
|       4|ENVCHECK            |     15327|        20|                    |          |
|       5|AUTOABAP            |      1021|        46|                    |          |
|       6|BGRFC_WATCHDOG      |      1022|        46|                    |          |
|       7|AUTOTH              |      1189|        56|                    |          |
|       8|AUTOCCMS            |      5108|        49|                    |          |
|       9|AUTOSECURITY        |      5108|        49|                    |          |
|      10|LOAD_CALCULATION    |    306180|         1|                    |          |
|      11|SPOOLALRM           |      5109|        49|                    |          |
|      12|CALL_DELAYED        |         0|      4195|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 32 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:49:30:131 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:49:50:132 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:50:10:132 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-16336
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-16337
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16338

Tue Sep 17 19:50:11:817 2019

*** ERROR => DpHdlDeadWp: W1 (pid 16336) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16336) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16336)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16337) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16337) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16337)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:50:18:075 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:50:30:133 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16338 terminated

Tue Sep 17 19:50:50:134 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:51:10:135 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:51:30:135 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:51:50:136 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:52:10:137 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:52:30:138 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:52:50:138 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:53:10:139 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:53:30:139 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:53:50:140 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:54:10:140 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:54:30:141 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:54:50:141 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:50:10 2019, skip new snapshot

Tue Sep 17 19:55:10:143 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 33 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:55:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 19:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 19:55:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10883857, readCount 10883857)
UPD : 0 (peak 31, writeCount 2264, readCount 2264)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080151, readCount 1080151)
SPO : 0 (peak 2, writeCount 11259, readCount 11259)
UP2 : 0 (peak 1, writeCount 1083, readCount 1083)
DISP: 0 (peak 67, writeCount 418044, readCount 418044)
GW  : 0 (peak 45, writeCount 9940426, readCount 9940426)
ICM : 0 (peak 186, writeCount 196933, readCount 196933)
LWP : 2 (peak 15, writeCount 17203, readCount 17201)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 19:55:10 2019
------------------------------------------------------------
Current snapshot id: 33
DB clean time (in percent of total time) : 23.69 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |40 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |40 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |
| 16|404     |DIA |WP_RUN |     |   |norm|T110_U932_M0    |HTTP_NORM|      |   |    1|<HANDLE PLUGIN>                         |001|SM_EXTERN_WS|                    |                    |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 19:55:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |19:54:57|2  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T110_U932_M0    |001|SM_EXTERN_WS|10.54.36.37         |19:55:09|16 |SAPMHTTP                                |norm|     |                                                  |          |      4590|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 4 logons with 4 sessions

Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 19:55:10 2019
------------------------------------------------------------
|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  2|Tue Sep 17 19:54:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 19:55:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884
Periodic Tasks Tue Sep 17 19:55:10 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2556|        48|                    |          |
|       1|DDLOG               |      2556|        48|                    |          |
|       2|BTCSCHED            |      5114|        49|                    |          |
|       3|RESTART_ALL         |      1023|       286|                    |          |
|       4|ENVCHECK            |     15345|        20|                    |          |
|       5|AUTOABAP            |      1023|       286|                    |          |
|       6|BGRFC_WATCHDOG      |      1024|       286|                    |          |
|       7|AUTOTH              |      1195|        56|                    |          |
|       8|AUTOCCMS            |      5114|        49|                    |          |
|       9|AUTOSECURITY        |      5114|        49|                    |          |
|      10|LOAD_CALCULATION    |    306538|         0|                    |          |
|      11|SPOOLALRM           |      5115|        49|                    |          |
|      12|CALL_DELAYED        |         0|      3835|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 33 (Reason: Workprocess 1 died / Time: Tue Sep 17 19:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-17997
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:55:10:149 2019

DpWpDynCreate: created new work process W12-17998

Tue Sep 17 19:55:11:885 2019

*** ERROR => DpHdlDeadWp: W1 (pid 17997) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17997) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 17997)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 17998) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17998) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 17998)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:55:30:143 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:55:50:143 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 19:56:10:144 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18451

Tue Sep 17 19:56:18:126 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:56:30:144 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 18451 terminated

Tue Sep 17 19:56:50:145 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:57:10:145 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:57:30:145 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:57:50:146 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, sknew snapshot

Tue Sep 17 19:58:10:146 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:58:30:147 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:58:50:147 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:59:10:148 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:59:30:148 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 19:59:50:148 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 20:00:10:149 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19905
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19906

Tue Sep 17 20:00:11:871 2019

*** ERROR => DpHdlDeadWp: W1 (pid 19905) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19905) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19905)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19906) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19906) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19906)
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 20:00:30:149 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 20:00:50:150 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 19:56:10 2019, skip new snapshot

Tue Sep 17 20:01:10:151 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 34 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:01:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:01:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10884835, readCount 10884835)
UPD : 0 (peak 31, writeCount 2265, readCount 2265)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080155, readCount 1080155)
SPO : 0 (peak 2, writeCount 11272, readCount 11272)
UP2 : 0 (peak 1, writeCount 1084, readCount 1084)
DISP: 0 (peak 67, writeCount 418085, readCount 418085)
GW  : 0 (peak 45, writeCount 9941124, readCount 9941124)
ICM : 0 (peak 186, writeCount 196960, readCount 196960)
LWP : 0 (peak 15, writeCount 17218, readCount 17218)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:01:10 2019
------------------------------------------------------------
Current snapshot id: 34
DB clean time (in percent of total time) : 23.70 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |42 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |42 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 20:01:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:00:57|3 |SAPMSSY1 |norm| | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:01:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 20:00:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)
MPI Info Tue Sep 17 20:01:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:01:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2559| 48| | |
| 1|DDLOG | 2559| 48| | |
| 2|BTCSCHED | 5120| 49| | |
| 3|RESTART_ALL | 1024| 226| | |
| 4|ENVCHECK | 15363| 20| | |
| 5|AUTOABAP | 1024| 226| | |
| 6|BGRFC_WATCHDOG | 1025| 226| | |
| 7|AUTOTH | 1201| 56| | |
| 8|AUTOCCMS | 5120| 49| | |
| 9|AUTOSECURITY | 5120| 49| | |
| 10|LOAD_CALCULATION | 306897| 0| | |
| 11|SPOOLALRM | 5121| 49| | |
| 12|CALL_DELAYED | 0| 3475| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 34 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:01:30:151 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:01:50:151 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:02:10:152 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23223

Tue Sep 17 20:02:18:315 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:02:30:153 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23223 terminated

Tue Sep 17 20:02:50:153 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:03:10:154 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:03:30:154 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:03:50:155 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:04:10:156 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:04:30:156 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:04:50:157 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:05:10:157 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-2115
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-2116

Tue Sep 17 20:05:11:887 2019

*** ERROR => DpHdlDeadWp: W1 (pid 2115) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2115) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 2115)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 2116) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2116) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 2116)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:05:30:158 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:05:50:159 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:06:10:159 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:06:30:160 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:06:50:161 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:02:10 2019, skip new snapshot

Tue Sep 17 20:07:10:161 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 35 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:07:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:07:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10885927, readCount 10885927)
UPD : 0 (peak 31, writeCount 2266, readCount 2266)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080159, readCount 1080159)
SPO : 0 (peak 2, writeCount 11285, readCount 11285)
UP2 : 0 (peak 1, writeCount 1085, readCount 1085)
DISP: 0 (peak 67, writeCount 418126, readCount 418126)
GW : 0 (peak 45, writeCount 9941950, readCount 9941950)
ICM : 0 (peak 186, writeCount 196987, readCount 196987)
LWP : 0 (peak 15, writeCount 17233, readCount 17233)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:07:10 2019
------------------------------------------------------------
Current snapshot id: 35
DB clean time (in percent of total time) : 23.71 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |43 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |43 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 20:07:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:06:57|16 |SAPMSSY1 |norm| | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:07:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 20:06:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:07:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:07:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2562| 48| | |
| 1|DDLOG | 2562| 48| | |
| 2|BTCSCHED | 5126| 49| | |
| 3|RESTART_ALL | 1025| 166| | |
| 4|ENVCHECK | 15381| 20| | |
| 5|AUTOABAP | 1025| 166| | |
| 6|BGRFC_WATCHDOG | 1026| 166| | |
| 7|AUTOTH | 1207| 56| | |
| 8|AUTOCCMS | 5126| 49| | |
| 9|AUTOSECURITY | 5126| 49| | |
| 10|LOAD_CALCULATION | 307256| 0| | |
| 11|SPOOLALRM | 5127| 49| | |
| 12|CALL_DELAYED | 0| 3115| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 35 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:07:30:162 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:07:50:163 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:08:10:167 2019


DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 8174
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:08:17:705 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:08:30:168 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 8174 terminated

Tue Sep 17 20:08:50:168 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:09:10:169 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:09:30:169 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:09:50:169 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:10:10:170 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-16235
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-16236

Tue Sep 17 20:10:11:904 2019

*** ERROR => DpHdlDeadWp: W1 (pid 16235) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16235) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16235)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16236) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16236) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16236)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:10:30:171 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:10:50:171 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:11:10:171 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:11:30:172 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:11:50:173 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:12:10:174 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:12:30:175 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:12:50:175 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:08:10 2019, skip new snapshot

Tue Sep 17 20:13:10:176 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 36 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:13:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:13:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10886803, readCount 10886803)
UPD : 0 (peak 31, writeCount 2268, readCount 2268)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080167, readCount 1080167)
SPO : 0 (peak 2, writeCount 11299, readCount 11299)
UP2 : 0 (peak 1, writeCount 1087, readCount 1087)
DISP: 0 (peak 67, writeCount 418167, readCount 418167)
GW : 0 (peak 45, writeCount 9942568, readCount 9942568)
ICM : 0 (peak 186, writeCount 197016, readCount 197016)
LWP : 2 (peak 15, writeCount 17263, readCount 17261)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:13:10 2019
------------------------------------------------------------
Current snapshot id: 36
DB clean time (in percent of total time) : 23.72 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |44 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |44 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 20:13:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:12:57|2 |SAPMSSY1 |norm| | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:13:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Tue Sep 17 20:12:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:13:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:13:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2565| 48| | |
| 1|DDLOG | 2565| 48| | |
| 2|BTCSCHED | 5132| 49| | |
| 3|RESTART_ALL | 1026| 106| | |
| 4|ENVCHECK | 15399| 20| | |
| 5|AUTOABAP | 1026| 106| | |
| 6|BGRFC_WATCHDOG | 1027| 106| | |
| 7|AUTOTH | 1213| 56| | |
| 8|AUTOCCMS | 5132| 49| | |
| 9|AUTOSECURITY | 5132| 49| | |
| 10|LOAD_CALCULATION | 307615| 1| | |
| 11|SPOOLALRM | 5133| 49| | |
| 12|CALL_DELAYED | 0| 2755| | |
|

Found 13 periodic tasks

********** SERVER SNAPSHOT 36 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:13:30:177 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:13:50:177 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:14:10:178 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 20109
Tue Sep 17 20:14:17:956 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:14:30:178 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 20109 terminated

Tue Sep 17 20:14:50:178 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:15:10:179 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-20959
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-20960

Tue Sep 17 20:15:11:714 2019

*** ERROR => DpHdlDeadWp: W1 (pid 20959) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20959) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 20959)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 20960) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20960) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 20960)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:15:30:180 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:15:50:180 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:16:10:180 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:16:30:181 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:16:50:181 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:17:10:181 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:17:30:182 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:17:50:182 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:18:10:182 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:18:30:183 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:18:50:184 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:14:10 2019, skip new snapshot

Tue Sep 17 20:19:10:184 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 37 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:19:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:19:10 2019

------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10887639, readCount 10887639)
UPD : 0 (peak 31, writeCount 2269, readCount 2269)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080171, readCount 1080171)
SPO : 0 (peak 2, writeCount 11312, readCount 11312)
UP2 : 0 (peak 1, writeCount 1088, readCount 1088)
DISP: 0 (peak 67, writeCount 418208, readCount 418208)
GW : 0 (peak 45, writeCount 9943148, readCount 9943148)
ICM : 0 (peak 186, writeCount 197043, readCount 197043)
LWP : 2 (peak 15, writeCount 17278, readCount 17276)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:19:10 2019

------------------------------------------------------------

Current snapshot id: 37

DB clean time (in percent of total time) : 23.73 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |45 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |45 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |
Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:19:10 2019

------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:18:57|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:19:10 2019

------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 20:18:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:19:10 2019

------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:19:10 2019

------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2568| 48| | |
| 1|DDLOG | 2568| 48| | |
| 2|BTCSCHED | 5138| 49| | |
| 3|RESTART_ALL | 1027| 46| | |
| 4|ENVCHECK | 15417| 20| | |
| 5|AUTOABAP | 1027| 46| | |
| 6|BGRFC_WATCHDOG | 1028| 46| | |
| 7|AUTOTH | 1219| 56| | |
| 8|AUTOCCMS | 5138| 49| | |
| 9|AUTOSECURITY | 5138| 49| | |
| 10|LOAD_CALCULATION | 307973| 0| | |
| 11|SPOOLALRM | 5139| 49| | |
| 12|CALL_DELAYED | 0| 2395| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 37 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:19:30:184 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:19:50:184 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:20:10:185 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-22279
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-22280
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22281

Tue Sep 17 20:20:11:680 2019
*** ERROR => DpHdlDeadWp: W1 (pid 22279) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22279) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 22279)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 22280) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22280) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 22280)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:20:18:411 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:20:30:186 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22281 terminated

Tue Sep 17 20:20:50:187 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:21:10:187 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:21:30:188 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:21:50:189 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:22:10:190 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:22:30:191 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:22:50:191 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:23:10:192 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:23:30:192 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:23:50:193 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:24:10:194 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:24:30:195 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:24:50:195 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:20:10 2019, skip new snapshot

Tue Sep 17 20:25:10:195 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 38 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:25:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:25:10 2019

------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10888454, readCount 10888454)
UPD : 0 (peak 31, writeCount 2270, readCount 2270)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080175, readCount 1080175)
SPO : 0 (peak 2, writeCount 11325, readCount 11325)
UP2 : 0 (peak 1, writeCount 1089, readCount 1089)
DISP: 0 (peak 67, writeCount 418253, readCount 418253)
GW : 1 (peak 45, writeCount 9943697, readCount 9943696)
ICM : 0 (peak 186, writeCount 197070, readCount 197070)
LWP : 2 (peak 15, writeCount 17293, readCount 17291)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:25:10 2019

------------------------------------------------------------

Current snapshot id: 38

DB clean time (in percent of total time) : 23.74 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |46 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |46 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:25:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:24:57|2 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:25:10 2019

------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Tue Sep 17 20:24:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:25:10 2019

------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:25:10 2019

------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2571| 48| | |
| 1|DDLOG | 2571| 48| | |
| 2|BTCSCHED | 5144| 49| | |
| 3|RESTART_ALL | 1029| 286| | |
| 4|ENVCHECK | 15435| 20| | |
| 5|AUTOABAP | 1029| 286| | |
| 6|BGRFC_WATCHDOG | 1030| 286| | |
| 7|AUTOTH | 1225| 56| | |
| 8|AUTOCCMS | 5144| 49| | |
| 9|AUTOSECURITY | 5144| 49| | |
| 10|LOAD_CALCULATION | 308331| 0| | |
| 11|SPOOLALRM | 5145| 49| | |
| 12|CALL_DELAYED | 0| 2035| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 38 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-24377
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:25:10:202 2019
DpWpDynCreate: created new work process W12-24378

Tue Sep 17 20:25:11:914 2019
*** ERROR => DpHdlDeadWp: W1 (pid 24377) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24377) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 24377)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 24378) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24378) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 24378)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:25:30:196 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:25:50:197 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:26:10:197 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 24676

Tue Sep 17 20:26:18:410 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:26:30:197 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 24676 terminated

Tue Sep 17 20:26:50:197 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:27:10:198 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:27:30:199 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:27:50:199 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:28:10:199 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:28:30:199 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:28:50:199 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:29:10:200 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:29:30:201 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:29:50:202 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:30:10:202 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-26156
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-26157

Tue Sep 17 20:30:11:955 2019
*** ERROR => DpHdlDeadWp: W1 (pid 26156) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26156) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 26156)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 26157) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26157) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 26157)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:30:30:203 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:30:50:204 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:26:10 2019, skip new snapshot

Tue Sep 17 20:31:10:204 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 39 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:31:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:31:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10889285, readCount 10889285)
UPD : 0 (peak 31, writeCount 2271, readCount 2271)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080179, readCount 1080179)
SPO : 0 (peak 2, writeCount 11338, readCount 11338)
UP2 : 0 (peak 1, writeCount 1090, readCount 1090)
DISP: 0 (peak 67, writeCount 418294, readCount 418294)
GW : 0 (peak 45, writeCount 9944253, readCount 9944253)
ICM : 0 (peak 186, writeCount 197099, readCount 197099)
LWP : 0 (peak 15, writeCount 17308, readCount 17308)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:31:10 2019
------------------------------------------------------------

Current snapshot id: 39
DB clean time (in percent of total time) : 23.75 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |48 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |48 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:31:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:30:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:31:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 20:30:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:31:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:31:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2574| 48| | |
| 1|DDLOG | 2574| 48| | |
| 2|BTCSCHED | 5150| 49| | |
| 3|RESTART_ALL | 1030| 226| | |
| 4|ENVCHECK | 15453| 20| | |
| 5|AUTOABAP | 1030| 226| | |
| 6|BGRFC_WATCHDOG | 1031| 226| | |
| 7|AUTOTH | 1231| 56| | |
| 8|AUTOCCMS | 5150| 49| | |
| 9|AUTOSECURITY | 5150| 49| | |
| 10|LOAD_CALCULATION | 308690| 1| | |
| 11|SPOOLALRM | 5151| 49| | |
| 12|CALL_DELAYED | 0| 1675| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 39 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:31:30:204 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:31:50:205 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:32:10:205 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 26694

Tue Sep 17 20:32:18:582 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:32:30:206 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 26694 terminated

Tue Sep 17 20:32:50:207 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:33:10:207 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:33:30:207 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:33:50:207 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:34:10:208 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:34:30:209 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:34:50:210 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:35:10:211 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-28028
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-28029

Tue Sep 17 20:35:12:004 2019

*** ERROR => DpHdlDeadWp: W1 (pid 28028) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28028) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28028)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28029) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28029) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28029)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:35:30:212 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:35:50:212 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:36:10:213 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:36:30:213 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:36:50:214 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:32:10 2019, skip new snapshot

Tue Sep 17 20:37:10:214 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 40 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:37:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:37:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10890112, readCount 10890112)
UPD : 0 (peak 31, writeCount 2272, readCount 2272)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080183, readCount 1080183)
SPO : 0 (peak 2, writeCount 11351, readCount 11351)
UP2 : 0 (peak 1, writeCount 1091, readCount 1091)
DISP: 0 (peak 67, writeCount 418334, readCount 418334)
GW : 0 (peak 45, writeCount 9944813, readCount 9944813)
ICM : 0 (peak 186, writeCount 197126, readCount 197126)
LWP : 0 (peak 15, writeCount 17323, readCount 17323)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:37:10 2019
------------------------------------------------------------

Current snapshot id: 40
DB clean time (in percent of total time) : 23.76 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |49 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 5|8468 |DIA |WP_RUN | | |norm|T24_U3855_M0 |SYNC_RFC | | |0|<HANDLE RFC> |001|SAPJSF |READDIR |USR02 |
| 12| |BTC |WP_KILL| |49 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:37:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T24_U3855_M0 |001|SAPJSF |smprd02.niladv.org |20:37:10|5 |SAPMSSY1 |norm| | | | 4200|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:36:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (2 entries) Tue Sep 17 20:37:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 26|78029214|78029214SU3855_M0 |T24_U3855_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 5|Tue Sep 17 20:37:10 2019 |
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 20:36:57 2019 |

Found 2 RFC-Connections

CA Blocks
------------------------------------------------------------
333 WORKER 8468
1 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:37:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:37:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2577| 48| | |
| 1|DDLOG | 2577| 48| | |
| 2|BTCSCHED | 5156| 49| | |
| 3|RESTART_ALL | 1031| 166| | |
| 4|ENVCHECK | 15471| 20| | |
| 5|AUTOABAP | 1031| 166| | |
| 6|BGRFC_WATCHDOG | 1032| 166| | |
| 7|AUTOTH | 1237| 56| | |
| 8|AUTOCCMS | 5156| 49| | |
| 9|AUTOSECURITY | 5156| 49| | |
| 10|LOAD_CALCULATION | 309049| 1| | |
| 11|SPOOLALRM | 5157| 49| | |
| 12|CALL_DELAYED | 0| 1315| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 40 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:37:30:214 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:37:50:216 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:38:10:216 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 28752

Tue Sep 17 20:38:18:811 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:38:30:217 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 28752 terminated

Tue Sep 17 20:38:50:217 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:39:10:217 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:39:30:217 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:39:50:218 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:40:10:219 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-29648
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-29649

Tue Sep 17 20:40:11:943 2019

*** ERROR => DpHdlDeadWp: W1 (pid 29648) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29648) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 29648)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 29649) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29649) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 29649)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:40:30:220 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:40:50:220 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:41:10:221 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:41:30:221 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:41:50:222 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:42:10:223 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:42:30:223 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:42:50:224 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:38:10 2019, skip new snapshot

Tue Sep 17 20:43:10:224 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 41 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:43:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:43:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10890903, readCount 10890903)
UPD : 0 (peak 31, writeCount 2274, readCount 2274)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080191, readCount 1080191)
SPO : 0 (peak 2, writeCount 11365, readCount 11365)
UP2 : 0 (peak 1, writeCount 1093, readCount 1093)
DISP: 0 (peak 67, writeCount 418375, readCount 418375)
GW : 0 (peak 45, writeCount 9945349, readCount 9945349)
ICM : 0 (peak 186, writeCount 197155, readCount 197155)
LWP : 2 (peak 15, writeCount 17353, readCount 17351)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:43:10 2019
------------------------------------------------------------

Current snapshot id: 41
DB clean time (in percent of total time) : 23.76 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |50 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |50 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:43:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:42:57|6 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:43:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 6|Tue Sep 17 20:42:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:43:10 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:43:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2580| 48| | |
| 1|DDLOG | 2580| 48| | |
| 2|BTCSCHED | 5162| 49| | |
| 3|RESTART_ALL | 1032| 106| | |
| 4|ENVCHECK | 15489| 20| | |
| 5|AUTOABAP | 1032| 106| | |
| 6|BGRFC_WATCHDOG | 1033| 106| | |
| 7|AUTOTH | 1243| 56| | |
| 8|AUTOCCMS | 5162| 49| | |
| 9|AUTOSECURITY | 5162| 49| | |
| 10|LOAD_CALCULATION | 309408| 1| | |
| 11|SPOOLALRM | 5163| 49| | |
| 12|CALL_DELAYED | 0| 955| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 41 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:43:30:225 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:43:50:226 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:44:10:227 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 30936

Tue Sep 17 20:44:18:491 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:44:30:227 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 30936 terminated

Tue Sep 17 20:44:50:228 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:45:10:229 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-31596
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-31597

Tue Sep 17 20:45:11:956 2019

*** ERROR => DpHdlDeadWp: W1 (pid 31596) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31596) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31596)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 31597) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31597) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 31597)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:45:30:230 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:45:50:230 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:46:10:230 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:46:30:230 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:46:50:230 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:47:10:231 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:47:30:232 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:47:50:233 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:48:10:233 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:48:30:235 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:48:50:236 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:44:10 2019, skip new snapshot

Tue Sep 17 20:49:10:236 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 42 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:49:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:49:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10891705, readCount 10891705)
UPD : 0 (peak 31, writeCount 2275, readCount 2275)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080195, readCount 1080195)
SPO : 0 (peak 2, writeCount 11378, readCount 11378)
UP2 : 0 (peak 1, writeCount 1094, readCount 1094)
DISP: 0 (peak 67, writeCount 418416, readCount 418416)
GW : 1 (peak 45, writeCount 9945889, readCount 9945888)
ICM : 0 (peak 186, writeCount 197182, readCount 197182)
LWP : 2 (peak 15, writeCount 17368, readCount 17366)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:49:10 2019
------------------------------------------------------------

Current snapshot id: 42
DB clean time (in percent of total time) : 23.77 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |51 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |51 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:49:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:48:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:49:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 20:48:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:49:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:49:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2583| 48| | |
| 1|DDLOG | 2583| 48| | |
| 2|BTCSCHED | 5168| 49| | |
| 3|RESTART_ALL | 1033| 46| | |
| 4|ENVCHECK | 15507| 20| | |
| 5|AUTOABAP | 1033| 46| | |
| 6|BGRFC_WATCHDOG | 1034| 46| | |
| 7|AUTOTH | 1249| 56| | |
| 8|AUTOCCMS | 5168| 49| | |
| 9|AUTOSECURITY | 5168| 49| | |
| 10|LOAD_CALCULATION | 309767| 1| | |
| 11|SPOOLALRM | 5169| 49| | |
| 12|CALL_DELAYED | 0| 595| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 42 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:49:30:237 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:49:50:238 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:50:10:238 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-481
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-482
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 483

Tue Sep 17 20:50:11:928 2019

*** ERROR => DpHdlDeadWp: W1 (pid 481) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=481) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 481)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 482) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=482) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 482)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:50:18:480 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:50:30:239 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 483 terminated

Tue Sep 17 20:50:50:239 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:51:10:239 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:51:30:240 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:51:50:240 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:52:10:241 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:52:30:242 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:52:50:243 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:53:10:244 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:53:30:245 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:53:50:246 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:54:10:247 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:54:30:247 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:54:50:247 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:50:10 2019, skip new snapshot

Tue Sep 17 20:55:10:248 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 43 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:55:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 20:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 20:55:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10892504, readCount 10892504)
UPD : 0 (peak 31, writeCount 2276, readCount 2276)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080199, readCount 1080199)
SPO : 0 (peak 2, writeCount 11391, readCount 11391)
UP2 : 0 (peak 1, writeCount 1095, readCount 1095)
DISP: 0 (peak 67, writeCount 418461, readCount 418461)
GW : 0 (peak 45, writeCount 9946432, readCount 9946432)
ICM : 0 (peak 186, writeCount 197209, readCount 197209)
LWP : 2 (peak 15, writeCount 17383, readCount 17381)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 20:55:10 2019
------------------------------------------------------------

Current snapshot id: 43
DB clean time (in percent of total time) : 23.78 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |52 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |52 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 20:55:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |20:54:57|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 20:55:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 20:54:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 20:55:10 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 20:55:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2586| 48| | |
| 1|DDLOG | 2586| 48| | |
| 2|BTCSCHED | 5174| 49| | |
| 3|RESTART_ALL | 1035| 286| | |
| 4|ENVCHECK | 15525| 20| | |
| 5|AUTOABAP | 1035| 286| | |
| 6|BGRFC_WATCHDOG | 1036| 286| | |
| 7|AUTOTH | 1255| 56| | |
| 8|AUTOCCMS | 5174| 49| | |
| 9|AUTOSECURITY | 5174| 49| | |
| 10|LOAD_CALCULATION | 310125| 0| | |
| 11|SPOOLALRM | 5175| 49| | |
| 12|CALL_DELAYED | 0| 235| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 43 (Reason: Workprocess 1 died / Time: Tue Sep 17 20:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-2672
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:55:10:254 2019

DpWpDynCreate: created new work process W12-2673

Tue Sep 17 20:55:11:973 2019

*** ERROR => DpHdlDeadWp: W1 (pid 2672) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2672) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 2672)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 2673) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2673) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 2673)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:55:30:249 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:55:50:250 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 20:56:10:251 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 2947

Tue Sep 17 20:56:19:222 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:56:30:252 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 2947 terminated

Tue Sep 17 20:56:50:252 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:57:10:252 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:57:30:253 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:57:50:253 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:58:10:253 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:58:30:254 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:58:50:254 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:59:10:255 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:59:30:256 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 20:59:50:256 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 21:00:10:257 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-4347
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-4348

Tue Sep 17 21:00:12:149 2019


*** ERROR => DpHdlDeadWp: W1 (pid 4347) died (severity=0, status=65280) [dpxxwp.c
1463]
DpTraceWpStatus: child (pid=4347) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 4347)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new
snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 4348) died (severity=0, status=65280) [dpxxwp.c
1463]
DpTraceWpStatus: child (pid=4348) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 4348)
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new
snapshot

Tue Sep 17 21:00:30:257 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 21:00:50:258 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 20:56:10 2019, skip new snapshot

Tue Sep 17 21:01:10:258 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 44 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:01:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:01:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10893305, readCount 10893305)
UPD : 0 (peak 31, writeCount 2277, readCount 2277)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080203, readCount 1080203)
SPO : 0 (peak 2, writeCount 11404, readCount 11404)
UP2 : 0 (peak 1, writeCount 1096, readCount 1096)
DISP: 0 (peak 67, writeCount 418505, readCount 418505)
GW : 0 (peak 45, writeCount 9946964, readCount 9946964)
ICM : 0 (peak 186, writeCount 197236, readCount 197236)
LWP : 0 (peak 15, writeCount 17398, readCount 17398)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:01:10 2019
------------------------------------------------------------
Current snapshot id: 44

DB clean time (in percent of total time) : 23.79 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |54 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |54 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16
Session Table Tue Sep 17 21:01:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:00:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:01:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 21:00:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:01:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:01:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2589| 48| | |
| 1|DDLOG | 2589| 48| | |
| 2|BTCSCHED | 5180| 49| | |
| 3|RESTART_ALL | 1036| 226| | |
| 4|ENVCHECK | 15543| 20| | |
| 5|AUTOABAP | 1036| 226| | |
| 6|BGRFC_WATCHDOG | 1037| 226| | |
| 7|AUTOTH | 1261| 56| | |
| 8|AUTOCCMS | 5180| 49| | |
| 9|AUTOSECURITY | 5180| 49| | |
| 10|LOAD_CALCULATION | 310484| 1| | |
| 11|SPOOLALRM | 5181| 49| | |
| 12|CALL_DELAYED | 0| 4339| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 44 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:01:30:258 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:01:50:258 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:02:10:258 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7990

Tue Sep 17 21:02:19:172 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:02:30:259 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7990 terminated

Tue Sep 17 21:02:50:259 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:03:10:260 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:03:30:260 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:03:50:260 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:04:10:260 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:04:30:260 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:04:50:261 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:05:10:262 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-20033
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-20034

Tue Sep 17 21:05:11:974 2019
*** ERROR => DpHdlDeadWp: W1 (pid 20033) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20033) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 20033)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 20034) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20034) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 20034)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:05:30:262 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:05:50:262 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:06:10:263 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:06:30:263 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:06:50:264 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:02:10 2019, skip new snapshot

Tue Sep 17 21:07:10:264 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 45 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:07:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:07:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10894535, readCount 10894535)
UPD : 0 (peak 31, writeCount 2278, readCount 2278)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080207, readCount 1080207)
SPO : 0 (peak 2, writeCount 11417, readCount 11417)
UP2 : 0 (peak 1, writeCount 1097, readCount 1097)
DISP: 0 (peak 67, writeCount 418546, readCount 418546)
GW : 0 (peak 45, writeCount 9947928, readCount 9947928)
ICM : 0 (peak 186, writeCount 197263, readCount 197263)
LWP : 0 (peak 15, writeCount 17413, readCount 17413)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:07:10 2019
------------------------------------------------------------
Current snapshot id: 45

DB clean time (in percent of total time) : 23.80 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |55 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |55 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 21:07:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:06:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:07:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 21:06:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:07:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:07:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2592| 48| | |
| 1|DDLOG | 2592| 48| | |
| 2|BTCSCHED | 5186| 49| | |
| 3|RESTART_ALL | 1037| 166| | |
| 4|ENVCHECK | 15561| 20| | |
| 5|AUTOABAP | 1037| 166| | |
| 6|BGRFC_WATCHDOG | 1038| 166| | |
| 7|AUTOTH | 1267| 56| | |
| 8|AUTOCCMS | 5186| 49| | |
| 9|AUTOSECURITY | 5186| 49| | |
| 10|LOAD_CALCULATION | 310842| 1| | |
| 11|SPOOLALRM | 5187| 49| | |
| 12|CALL_DELAYED | 0| 3979| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 45 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:07:30:265 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:07:50:266 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:08:10:267 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 2363

Tue Sep 17 21:08:17:551 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:08:30:267 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 2363 terminated

Tue Sep 17 21:08:50:268 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:09:10:269 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:09:30:270 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:09:50:271 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:10:10:271 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3359
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3360

Tue Sep 17 21:10:11:982 2019
*** ERROR => DpHdlDeadWp: W1 (pid 3359) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3359) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3359)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3360) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3360) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 3360)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:10:30:271 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:10:50:272 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:11:10:273 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:11:30:274 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:11:50:274 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:12:10:274 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:12:30:274 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:12:50:275 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:08:10 2019, skip new snapshot

Tue Sep 17 21:13:10:275 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 46 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:13:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:13:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10895432, readCount 10895432)
UPD : 0 (peak 31, writeCount 2280, readCount 2280)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080215, readCount 1080215)
SPO : 0 (peak 2, writeCount 11431, readCount 11431)
UP2 : 0 (peak 1, writeCount 1099, readCount 1099)
DISP: 0 (peak 67, writeCount 418587, readCount 418587)
GW : 0 (peak 45, writeCount 9948562, readCount 9948562)
ICM : 1 (peak 186, writeCount 197292, readCount 197291)
LWP : 2 (peak 15, writeCount 17443, readCount 17441)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:13:10 2019
------------------------------------------------------------
Current snapshot id: 46

DB clean time (in percent of total time) : 23.81 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |56 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |56 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |
| 16|404 |DIA |WP_RUN | | |norm|T61_U6372_M0 |HTTP_NORM| | |1|<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 21:13:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:12:57|6 |SAPMSSY1 |norm| | | | 4246|
|HTTP_NORMAL |T61_U6372_M0 |001|SM_EXTERN_WS|10.54.36.37 |21:13:09|16 |SAPMHTTP |norm| | | | 4590|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions

Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:13:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 6|Tue Sep 17 21:12:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:13:10 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:13:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2595| 48| | |
| 1|DDLOG | 2595| 48| | |
| 2|BTCSCHED | 5192| 49| | |
| 3|RESTART_ALL | 1038| 106| | |
| 4|ENVCHECK | 15579| 20| | |
| 5|AUTOABAP | 1038| 106| | |
| 6|BGRFC_WATCHDOG | 1039| 106| | |
| 7|AUTOTH | 1273| 56| | |
| 8|AUTOCCMS | 5192| 49| | |
| 9|AUTOSECURITY | 5192| 49| | |
| 10|LOAD_CALCULATION | 311201| 1| | |
| 11|SPOOLALRM | 5193| 49| | |
| 12|CALL_DELAYED | 0| 3619| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 46 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:13:30:276 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:13:50:277 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:14:10:277 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 4552

Tue Sep 17 21:14:19:370 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:14:30:278 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 4552 terminated

Tue Sep 17 21:14:50:278 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:15:10:279 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-5339
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-5340

Tue Sep 17 21:15:11:987 2019

*** ERROR => DpHdlDeadWp: W1 (pid 5339) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5339) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5339)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5340) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5340) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5340)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:15:30:280 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:15:50:281 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:16:10:282 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:16:30:283 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:16:50:284 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:17:10:284 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:17:30:285 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:17:50:286 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:18:10:286 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:18:30:287 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:18:50:288 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:14:10 2019, skip new snapshot

Tue Sep 17 21:19:10:289 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 47 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:19:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:19:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10896277, readCount 10896277)


UPD : 0 (peak 31, writeCount 2281, readCount 2281)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080219, readCount 1080219)
SPO : 0 (peak 2, writeCount 11444, readCount 11444)
UP2 : 0 (peak 1, writeCount 1100, readCount 1100)
DISP: 0 (peak 67, writeCount 418627, readCount 418627)
GW : 0 (peak 45, writeCount 9949142, readCount 9949142)
ICM : 0 (peak 186, writeCount 197319, readCount 197319)
LWP : 2 (peak 15, writeCount 17458, readCount 17456)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:19:10 2019


------------------------------------------------------------

Current snapshot id: 47


DB clean time (in percent of total time) : 23.82 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |57 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |57 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 21:19:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:18:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:19:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 21:18:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:19:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:19:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2598| 48| | |
| 1|DDLOG | 2598| 48| | |
| 2|BTCSCHED | 5198| 49| | |
| 3|RESTART_ALL | 1039| 46| | |
| 4|ENVCHECK | 15597| 20| | |
| 5|AUTOABAP | 1039| 46| | |
| 6|BGRFC_WATCHDOG | 1040| 46| | |
| 7|AUTOTH | 1279| 56| | |
| 8|AUTOCCMS | 5198| 49| | |
| 9|AUTOSECURITY | 5198| 49| | |
| 10|LOAD_CALCULATION | 311560| 1| | |
| 11|SPOOLALRM | 5199| 49| | |
| 12|CALL_DELAYED | 0| 3259| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 47 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:19:30:289 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:19:50:290 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:20:10:291 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-6947
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-6948
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 6949

Tue Sep 17 21:20:11:975 2019


*** ERROR => DpHdlDeadWp: W1 (pid 6947) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6947) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6947)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6948) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6948) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6948)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:20:19:378 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:20:30:291 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 6949 terminated

Tue Sep 17 21:20:50:291 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:21:10:292 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:21:30:293 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:21:50:294 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:22:10:294 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:22:30:294 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:22:50:295 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:23:10:295 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:23:30:296 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:23:50:297 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:24:10:297 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:24:30:298 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:24:50:299 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:20:10 2019, skip new snapshot

Tue Sep 17 21:25:10:299 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 48 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:25:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:25:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10897311, readCount 10897311)


UPD : 0 (peak 31, writeCount 2282, readCount 2282)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080223, readCount 1080223)
SPO : 0 (peak 2, writeCount 11457, readCount 11457)
UP2 : 0 (peak 1, writeCount 1101, readCount 1101)
DISP: 0 (peak 67, writeCount 418672, readCount 418672)
GW : 0 (peak 45, writeCount 9949927, readCount 9949927)
ICM : 0 (peak 186, writeCount 197346, readCount 197346)
LWP : 2 (peak 15, writeCount 17473, readCount 17471)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:25:10 2019


------------------------------------------------------------

Current snapshot id: 48


DB clean time (in percent of total time) : 23.83 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |58 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |58 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 21:25:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:24:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:25:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 21:24:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:25:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:25:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2601| 48| | |
| 1|DDLOG | 2601| 48| | |
| 2|BTCSCHED | 5204| 49| | |
| 3|RESTART_ALL | 1041| 286| | |
| 4|ENVCHECK | 15615| 20| | |
| 5|AUTOABAP | 1041| 286| | |
| 6|BGRFC_WATCHDOG | 1042| 286| | |
| 7|AUTOTH | 1285| 56| | |
| 8|AUTOCCMS | 5204| 49| | |
| 9|AUTOSECURITY | 5204| 49| | |
| 10|LOAD_CALCULATION | 311918| 0| | |
| 11|SPOOLALRM | 5205| 49| | |
| 12|CALL_DELAYED | 0| 2899| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 48 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-8782
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:25:10:305 2019


DpWpDynCreate: created new work process W12-8783

Tue Sep 17 21:25:12:014 2019


*** ERROR => DpHdlDeadWp: W1 (pid 8782) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8782) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8782)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8783) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8783) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8783)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:25:30:299 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:25:50:300 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:26:10:300 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9048

Tue Sep 17 21:26:19:386 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:26:30:301 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9048 terminated

Tue Sep 17 21:26:50:301 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:27:10:301 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:27:30:301 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:27:50:301 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:28:10:302 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:28:30:302 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:28:50:303 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:29:10:303 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:29:30:304 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:29:50:304 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:30:10:305 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10799
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10800

Tue Sep 17 21:30:12:027 2019


*** ERROR => DpHdlDeadWp: W1 (pid 10799) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10799) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10799)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10800) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10800) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10800)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:30:30:314 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:30:50:315 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:26:10 2019, skip new snapshot

Tue Sep 17 21:31:10:315 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 49 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:31:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:31:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10898145, readCount 10898145)
UPD : 0 (peak 31, writeCount 2283, readCount 2283)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080227, readCount 1080227)
SPO : 0 (peak 2, writeCount 11470, readCount 11470)
UP2 : 0 (peak 1, writeCount 1102, readCount 1102)
DISP: 0 (peak 67, writeCount 418713, readCount 418713)
GW : 0 (peak 45, writeCount 9950483, readCount 9950483)
ICM : 0 (peak 186, writeCount 197375, readCount 197375)
LWP : 0 (peak 15, writeCount 17488, readCount 17488)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:31:10 2019


------------------------------------------------------------

Current snapshot id: 49


DB clean time (in percent of total time) : 23.83 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |60 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |60 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 21:31:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:30:57|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:31:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 21:30:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:31:10 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:31:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2604| 48| | |
| 1|DDLOG | 2604| 48| | |
| 2|BTCSCHED | 5210| 49| | |
| 3|RESTART_ALL | 1042| 226| | |
| 4|ENVCHECK | 15633| 20| | |
| 5|AUTOABAP | 1042| 226| | |
| 6|BGRFC_WATCHDOG | 1043| 226| | |
| 7|AUTOTH | 1291| 56| | |
| 8|AUTOCCMS | 5210| 49| | |
| 9|AUTOSECURITY | 5210| 49| | |
| 10|LOAD_CALCULATION | 312277| 1| | |
| 11|SPOOLALRM | 5211| 49| | |
| 12|CALL_DELAYED | 0| 2539| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 49 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:31:30:315 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:31:50:315 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:32:10:315 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 11434

Tue Sep 17 21:32:19:617 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:32:30:316 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 11434 terminated

Tue Sep 17 21:32:50:317 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:33:10:318 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:33:30:319 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:33:50:319 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:34:10:320 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:34:30:320 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:34:50:321 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:35:10:321 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-12582
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-12583

Tue Sep 17 21:35:12:039 2019

*** ERROR => DpHdlDeadWp: W1 (pid 12582) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12582) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 12582)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 12583) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12583) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 12583)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:35:30:321 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:35:50:322 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:36:10:323 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:36:30:324 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:36:50:324 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:32:10 2019, skip new snapshot

Tue Sep 17 21:37:10:324 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 50 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:37:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:37:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10898978, readCount 10898978)
UPD : 0 (peak 31, writeCount 2284, readCount 2284)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080231, readCount 1080231)
SPO : 0 (peak 2, writeCount 11483, readCount 11483)
UP2 : 0 (peak 1, writeCount 1103, readCount 1103)
DISP: 0 (peak 67, writeCount 418754, readCount 418754)
GW : 0 (peak 45, writeCount 9951051, readCount 9951051)
ICM : 0 (peak 186, writeCount 197402, readCount 197402)
LWP : 0 (peak 15, writeCount 17503, readCount 17503)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:37:10 2019


------------------------------------------------------------

Current snapshot id: 50


DB clean time (in percent of total time) : 23.84 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |61 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |61 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 21:37:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:36:57|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|
Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:37:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 21:36:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:37:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:37:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2607| 48| | |
| 1|DDLOG | 2607| 48| | |
| 2|BTCSCHED | 5216| 49| | |
| 3|RESTART_ALL | 1043| 166| | |
| 4|ENVCHECK | 15651| 20| | |
| 5|AUTOABAP | 1043| 166| | |
| 6|BGRFC_WATCHDOG | 1044| 166| | |
| 7|AUTOTH | 1297| 56| | |
| 8|AUTOCCMS | 5216| 49| | |
| 9|AUTOSECURITY | 5216| 49| | |
| 10|LOAD_CALCULATION | 312636| 1| | |
| 11|SPOOLALRM | 5217| 49| | |
| 12|CALL_DELAYED | 0| 2179| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 50 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:37:30:325 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:37:50:325 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:38:10:325 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13386

Tue Sep 17 21:38:18:480 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:38:30:326 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13386 terminated

Tue Sep 17 21:38:50:326 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:39:10:326 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:39:30:327 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:39:50:327 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:40:10:328 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-14482
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-14483

Tue Sep 17 21:40:12:047 2019

*** ERROR => DpHdlDeadWp: W1 (pid 14482) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14482) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14482)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14483) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14483) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14483)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:40:30:328 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:40:50:328 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:41:10:329 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:41:30:330 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:41:50:330 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:42:10:331 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:42:30:332 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:42:50:333 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:38:10 2019, skip new snapshot

Tue Sep 17 21:43:10:333 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 51 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:43:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:43:10 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:43:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10899979, readCount 10899979)
UPD : 0 (peak 31, writeCount 2286, readCount 2286)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080239, readCount 1080239)
SPO : 0 (peak 2, writeCount 11497, readCount 11497)
UP2 : 0 (peak 1, writeCount 1105, readCount 1105)
DISP: 0 (peak 67, writeCount 418795, readCount 418795)
GW : 0 (peak 45, writeCount 9951789, readCount 9951789)
ICM : 0 (peak 186, writeCount 197431, readCount 197431)
LWP : 2 (peak 15, writeCount 17533, readCount 17531)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:43:10 2019


------------------------------------------------------------

Current snapshot id: 51


DB clean time (in percent of total time) : 23.85 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |62 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 3|19909 |DIA |WP_RUN | | |norm|T66_U8513_M0 |HTTP_NORM| | |1|<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |62 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 21:43:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:42:57|4 |SAPMSSY1 |norm| | | | 4246|
|HTTP_NORMAL |T66_U8513_M0 |001|SM_EXTERN_WS|10.54.36.37 |21:43:09|3 |SAPMHTTP |norm| | | | 4590|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|
Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:43:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 21:42:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:43:10 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:43:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2610| 48| | |
| 1|DDLOG | 2610| 48| | |
| 2|BTCSCHED | 5222| 49| | |
| 3|RESTART_ALL | 1044| 106| | |
| 4|ENVCHECK | 15669| 20| | |
| 5|AUTOABAP | 1044| 106| | |
| 6|BGRFC_WATCHDOG | 1045| 106| | |
| 7|AUTOTH | 1303| 56| | |
| 8|AUTOCCMS | 5222| 49| | |
| 9|AUTOSECURITY | 5222| 49| | |
| 10|LOAD_CALCULATION | 312995| 1| | |
| 11|SPOOLALRM | 5223| 49| | |
| 12|CALL_DELAYED | 0| 1819| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 51 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:43:30:333 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:43:50:334 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:44:10:334 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15476

Tue Sep 17 21:44:17:349 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:44:30:334 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15476 terminated

Tue Sep 17 21:44:50:335 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:45:10:336 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-16114
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-16115

Tue Sep 17 21:45:12:056 2019
*** ERROR => DpHdlDeadWp: W1 (pid 16114) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16114) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16114)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16115) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16115) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16115)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
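The paired messages above are consistent with POSIX wait-status encoding: a raw status of 65280 is 0xFF00, and the child's exit code sits in bits 8-15, so 65280 >> 8 gives the 255 that DpTraceWpStatus reports. A quick sketch of that decoding (`os.WEXITSTATUS` is available on POSIX systems such as the Linux host in this trace):

```python
import os

raw_status = 65280  # status reported by DpHdlDeadWp above (0xFF00)

# In a POSIX wait status, a normally exited child's exit code
# occupies bits 8-15; the low byte would hold a signal number.
exit_code = (raw_status >> 8) & 0xFF
print(exit_code)                    # 255, matching DpTraceWpStatus
print(os.WEXITSTATUS(raw_status))   # 255, same decoding via the stdlib
```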

Tue Sep 17 21:45:30:336 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:45:50:336 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:46:10:337 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:46:30:338 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:46:50:338 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:47:10:338 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:47:30:339 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:47:50:340 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:48:10:340 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:48:30:340 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:48:50:341 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:44:10 2019, skip new snapshot

Tue Sep 17 21:49:10:341 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
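The run of skip messages above reflects a simple cooldown: while a recent (or still pending) snapshot exists, new snapshot requests are dropped rather than created. A hypothetical sketch of such throttling — the 6-minute window is inferred from the timestamps of snapshots 51/52/53 in this trace, not a documented SAP parameter:

```python
# Hypothetical snapshot throttle mirroring the observed behavior:
# requests inside the cooldown window are skipped, the next one
# outside it creates a fresh snapshot.
COOLDOWN_SEC = 360  # inferred: snapshots here land 6 minutes apart

class SnapshotThrottle:
    def __init__(self, cooldown: int = COOLDOWN_SEC) -> None:
        self.cooldown = cooldown
        self.last_snapshot = None  # seconds since some epoch

    def request(self, now: int) -> str:
        if self.last_snapshot is not None and now - self.last_snapshot < self.cooldown:
            return "skip"    # cf. "DpSkipSnapshot: ... skip new snapshot"
        self.last_snapshot = now
        return "create"      # cf. "DpTriggerSapSnapshot: start ... sapcontrol"

t = SnapshotThrottle()
print(t.request(0))    # create
print(t.request(120))  # skip
print(t.request(360))  # create
```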

********** SERVER SNAPSHOT 52 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:49:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:49:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10900781, readCount 10900781)
UPD : 0 (peak 31, writeCount 2287, readCount 2287)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080243, readCount 1080243)
SPO : 0 (peak 2, writeCount 11510, readCount 11510)
UP2 : 0 (peak 1, writeCount 1106, readCount 1106)
DISP: 0 (peak 67, writeCount 418836, readCount 418836)
GW : 0 (peak 45, writeCount 9952333, readCount 9952333)
ICM : 0 (peak 186, writeCount 197458, readCount 197458)
LWP : 2 (peak 15, writeCount 17548, readCount 17546)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:49:10 2019
------------------------------------------------------------

Current snapshot id: 52
DB clean time (in percent of total time) : 23.86 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |63 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |63 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 21:49:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:48:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:49:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 21:48:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:49:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:49:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2613| 48| | |
| 1|DDLOG | 2613| 48| | |
| 2|BTCSCHED | 5228| 49| | |
| 3|RESTART_ALL | 1045| 46| | |
| 4|ENVCHECK | 15687| 20| | |
| 5|AUTOABAP | 1045| 46| | |
| 6|BGRFC_WATCHDOG | 1046| 46| | |
| 7|AUTOTH | 1309| 56| | |
| 8|AUTOCCMS | 5228| 49| | |
| 9|AUTOSECURITY | 5228| 49| | |
| 10|LOAD_CALCULATION | 313354| 1| | |
| 11|SPOOLALRM | 5229| 49| | |
| 12|CALL_DELAYED | 0| 1459| | |

Found 13 periodic tasks


********** SERVER SNAPSHOT 52 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:49:30:342 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:49:50:343 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:50:10:343 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-17647
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-17648
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 17649

Tue Sep 17 21:50:12:025 2019
*** ERROR => DpHdlDeadWp: W1 (pid 17647) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17647) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 17647)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 17648) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17648) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 17648)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:50:16:944 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:50:30:344 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 17649 terminated

Tue Sep 17 21:50:50:344 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:51:10:344 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:51:30:344 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:51:50:345 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:52:10:346 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:52:30:346 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:52:50:347 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:53:10:347 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:53:30:348 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:53:50:348 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:54:10:349 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:54:30:349 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:54:50:350 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:50:10 2019, skip new snapshot

Tue Sep 17 21:55:10:350 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 53 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:55:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 21:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 21:55:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10901572, readCount 10901572)
UPD : 0 (peak 31, writeCount 2288, readCount 2288)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080247, readCount 1080247)
SPO : 0 (peak 2, writeCount 11523, readCount 11523)
UP2 : 0 (peak 1, writeCount 1107, readCount 1107)
DISP: 0 (peak 67, writeCount 418881, readCount 418881)
GW : 1 (peak 45, writeCount 9952880, readCount 9952879)
ICM : 0 (peak 186, writeCount 197485, readCount 197485)
LWP : 2 (peak 15, writeCount 17563, readCount 17561)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 21:55:10 2019
------------------------------------------------------------

Current snapshot id: 53
DB clean time (in percent of total time) : 23.87 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |64 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |64 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 21:55:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |21:54:57|2 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 21:55:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Tue Sep 17 21:54:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 21:55:10 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 21:55:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2616| 48| | |
| 1|DDLOG | 2616| 48| | |
| 2|BTCSCHED | 5234| 49| | |
| 3|RESTART_ALL | 1047| 286| | |
| 4|ENVCHECK | 15705| 20| | |
| 5|AUTOABAP | 1047| 286| | |
| 6|BGRFC_WATCHDOG | 1048| 286| | |
| 7|AUTOTH | 1315| 56| | |
| 8|AUTOCCMS | 5234| 49| | |
| 9|AUTOSECURITY | 5234| 49| | |
| 10|LOAD_CALCULATION | 313713| 1| | |
| 11|SPOOLALRM | 5235| 49| | |
| 12|CALL_DELAYED | 0| 1099| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 53 (Reason: Workprocess 1 died / Time: Tue Sep 17 21:55:10 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-19251
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:55:10:355 2019
DpWpDynCreate: created new work process W12-19252

Tue Sep 17 21:55:11:998 2019
*** ERROR => DpHdlDeadWp: W1 (pid 19251) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19251) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19251)

Tue Sep 17 21:55:12:000 2019
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19252) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19252) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19252)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:55:30:351 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:55:50:351 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 21:56:10:351 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19512

Tue Sep 17 21:56:17:003 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:56:30:352 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19512 terminated

Tue Sep 17 21:56:50:353 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:57:10:353 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:57:30:354 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:57:50:355 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:58:10:355 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:58:30:356 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:58:50:357 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:59:10:358 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:59:30:358 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 21:59:50:358 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 22:00:10:359 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21260
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21261

Tue Sep 17 22:00:11:963 2019
*** ERROR => DpHdlDeadWp: W1 (pid 21260) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21260) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21260)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21261) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21261) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21261)
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 22:00:30:360 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 22:00:50:360 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 21:56:10 2019, skip new snapshot

Tue Sep 17 22:01:10:360 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 54 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:01:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:01:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10902375, readCount 10902375)
UPD : 0 (peak 31, writeCount 2289, readCount 2289)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080251, readCount 1080251)
SPO : 0 (peak 2, writeCount 11536, readCount 11536)
UP2 : 0 (peak 1, writeCount 1108, readCount 1108)
DISP: 0 (peak 67, writeCount 418921, readCount 418921)
GW : 0 (peak 45, writeCount 9953408, readCount 9953408)
ICM : 0 (peak 186, writeCount 197512, readCount 197512)
LWP : 0 (peak 15, writeCount 17578, readCount 17578)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:01:10 2019
------------------------------------------------------------
Current snapshot id: 54
DB clean time (in percent of total time) : 23.88 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |66 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |66 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:01:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |22:00:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:01:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 22:00:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:01:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:01:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2619| 48| | |
| 1|DDLOG | 2619| 48| | |
| 2|BTCSCHED | 5240| 49| | |
| 3|RESTART_ALL | 1048| 226| | |
| 4|ENVCHECK | 15723| 20| | |
| 5|AUTOABAP | 1048| 226| | |
| 6|BGRFC_WATCHDOG | 1049| 226| | |
| 7|AUTOTH | 1321| 56| | |
| 8|AUTOCCMS | 5240| 49| | |
| 9|AUTOSECURITY | 5240| 49| | |
| 10|LOAD_CALCULATION | 314071| 1| | |
| 11|SPOOLALRM | 5241| 49| | |
| 12|CALL_DELAYED | 0| 739| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 54 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:01:30:361 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:01:50:362 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:02:10:362 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25848

Tue Sep 17 22:02:18:339 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:02:30:363 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25848 terminated

Tue Sep 17 22:02:50:364 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:03:10:364 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:03:30:365 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:03:50:366 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:04:10:367 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:04:30:367 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:04:50:368 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:05:10:369 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-6246
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-6247

Tue Sep 17 22:05:12:103 2019
*** ERROR => DpHdlDeadWp: W1 (pid 6246) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6246) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6246)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6247) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6247) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6247)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:05:30:370 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:05:50:370 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:06:10:371 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:06:30:371 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:06:50:371 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:02:10 2019, skip new snapshot

Tue Sep 17 22:07:10:371 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 55 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:07:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:07:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10903419, readCount 10903419)
UPD : 0 (peak 31, writeCount 2290, readCount 2290)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080255, readCount 1080255)
SPO : 0 (peak 2, writeCount 11549, readCount 11549)
UP2 : 0 (peak 1, writeCount 1109, readCount 1109)
DISP: 0 (peak 67, writeCount 418962, readCount 418962)
GW : 1 (peak 45, writeCount 9954174, readCount 9954173)
ICM : 0 (peak 186, writeCount 197539, readCount 197539)
LWP : 0 (peak 15, writeCount 17593, readCount 17593)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):
Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:07:10 2019
------------------------------------------------------------
Current snapshot id: 55
DB clean time (in percent of total time) : 23.89 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |67 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |67 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:07:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |22:06:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:07:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 22:06:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:07:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:07:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2622| 48| | |
| 1|DDLOG | 2622| 48| | |
| 2|BTCSCHED | 5246| 49| | |
| 3|RESTART_ALL | 1049| 166| | |
| 4|ENVCHECK | 15741| 20| | |
| 5|AUTOABAP | 1049| 166| | |
| 6|BGRFC_WATCHDOG | 1050| 166| | |
| 7|AUTOTH | 1327| 56| | |
| 8|AUTOCCMS | 5246| 49| | |
| 9|AUTOSECURITY | 5246| 49| | |
| 10|LOAD_CALCULATION | 314430| 1| | |
| 11|SPOOLALRM | 5247| 49| | |
| 12|CALL_DELAYED | 0| 379| | |

Found 13 periodic tasks


********** SERVER SNAPSHOT 55 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:07:30:372 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:07:50:373 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:08:10:374 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19158

Tue Sep 17 22:08:18:401 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:08:30:375 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19158 terminated

Tue Sep 17 22:08:50:375 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:09:10:376 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:09:30:377 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:09:50:377 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:10:10:378 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-20080
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-20081

Tue Sep 17 22:10:12:086 2019
*** ERROR => DpHdlDeadWp: W1 (pid 20080) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20080) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 20080)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 20081) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20081) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 20081)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:10:30:379 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:10:50:379 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:11:10:379 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:11:30:379 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:11:50:380 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:12:10:381 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:12:30:381 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:12:50:382 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:08:10 2019, skip new snapshot

Tue Sep 17 22:13:10:382 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 56 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:13:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:13:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10904262, readCount 10904262)
UPD : 0 (peak 31, writeCount 2292, readCount 2292)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080263, readCount 1080263)
SPO : 0 (peak 2, writeCount 11563, readCount 11563)
UP2 : 0 (peak 1, writeCount 1111, readCount 1111)
DISP: 0 (peak 67, writeCount 419003, readCount 419003)
GW : 0 (peak 45, writeCount 9954762, readCount 9954762)
ICM : 1 (peak 186, writeCount 197568, readCount 197567)
LWP : 2 (peak 15, writeCount 17623, readCount 17621)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:13:10 2019
------------------------------------------------------------
Current snapshot id: 56
DB clean time (in percent of total time) : 23.90 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  0|31517   |DIA |WP_RUN |     |   |norm|T24_U10526_M0   |HTTP_NORM|      |   |    1|<HANDLE PLUGIN>                         |001|SM_EXTERN_WS|                    |                    |
|  1|        |DIA |WP_KILL|     |68 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |68 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:13:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T24_U10526_M0   |001|SM_EXTERN_WS|10.54.36.37         |22:13:09|0  |SAPMHTTP                                |norm|     |                                                  |          |      4590|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |22:12:57|3  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
RFC-Connection Table (1 entries) Tue Sep 17 22:13:10 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  3|Tue Sep 17 22:12:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:13:10 2019
------------------------------------------------------------
Current pipes in use: 223
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:13:10 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2625|        48|                    |          |
|       1|DDLOG               |      2625|        48|                    |          |
|       2|BTCSCHED            |      5252|        49|                    |          |
|       3|RESTART_ALL         |      1050|       106|                    |          |
|       4|ENVCHECK            |     15759|        20|                    |          |
|       5|AUTOABAP            |      1050|       106|                    |          |
|       6|BGRFC_WATCHDOG      |      1051|       106|                    |          |
|       7|AUTOTH              |      1333|        56|                    |          |
|       8|AUTOCCMS            |      5252|        49|                    |          |
|       9|AUTOSECURITY        |      5252|        49|                    |          |
|      10|LOAD_CALCULATION    |    314789|         1|                    |          |
|      11|SPOOLALRM           |      5253|        49|                    |          |
|      12|CALL_DELAYED        |         0|        19|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 56 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:13:30:383 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:13:50:384 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:14:10:384 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 21228

Tue Sep 17 22:14:18:329 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:14:30:384 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 21228 terminated

Tue Sep 17 22:14:50:385 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:15:10:386 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21872
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21873

Tue Sep 17 22:15:12:088 2019

*** ERROR => DpHdlDeadWp: W1 (pid 21872) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21872) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21872)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21873) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21873) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21873)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:15:30:387 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:15:50:387 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:16:10:388 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:16:30:389 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:16:50:390 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:17:10:391 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:17:30:391 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:17:50:391 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:18:10:391 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:18:30:391 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:18:50:392 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:14:10 2019, skip new snapshot

Tue Sep 17 22:19:10:392 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 57 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:19:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:19:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10905099, readCount 10905099)
UPD : 0 (peak 31, writeCount 2293, readCount 2293)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080267, readCount 1080267)
SPO : 0 (peak 2, writeCount 11576, readCount 11576)
UP2 : 0 (peak 1, writeCount 1112, readCount 1112)
DISP: 0 (peak 67, writeCount 419048, readCount 419048)
GW : 1 (peak 45, writeCount 9955354, readCount 9955353)
ICM : 0 (peak 186, writeCount 197595, readCount 197595)
LWP : 2 (peak 15, writeCount 17638, readCount 17636)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:19:10 2019
------------------------------------------------------------
Current snapshot id: 57
DB clean time (in percent of total time) : 23.91 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |69 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |69 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:19:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |22:18:57|16 |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:19:10 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   | 16|Tue Sep 17 22:18:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:19:10 2019
------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:19:10 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2628|        48|                    |          |
|       1|DDLOG               |      2628|        48|                    |          |
|       2|BTCSCHED            |      5258|        49|                    |          |
|       3|RESTART_ALL         |      1051|        46|                    |          |
|       4|ENVCHECK            |     15777|        20|                    |          |
|       5|AUTOABAP            |      1051|        46|                    |          |
|       6|BGRFC_WATCHDOG      |      1052|        46|                    |          |
|       7|AUTOTH              |      1339|        56|                    |          |
|       8|AUTOCCMS            |      5258|        49|                    |          |
|       9|AUTOSECURITY        |      5258|        49|                    |          |
|      10|LOAD_CALCULATION    |    315148|         1|                    |          |
|      11|SPOOLALRM           |      5259|        49|                    |          |
|      12|CALL_DELAYED        |         0|      1195|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 57 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:19:10 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:19:30:393 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:19:50:394 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:20:10:394 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-23431
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-23432
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23433

Tue Sep 17 22:20:12:063 2019

*** ERROR => DpHdlDeadWp: W1 (pid 23431) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23431) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 23431)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 23432) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23432) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 23432)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:20:18:348 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:20:30:394 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23433 terminated

Tue Sep 17 22:20:50:395 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:21:10:396 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:21:30:397 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:21:50:397 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:22:10:397 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:22:30:398 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:22:50:398 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:23:10:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:23:30:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:23:50:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:24:10:400 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:24:30:401 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:24:50:401 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:20:10 2019, skip new snapshot

Tue Sep 17 22:25:10:402 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 58 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:25:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:25:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10905928, readCount 10905928)
UPD : 0 (peak 31, writeCount 2294, readCount 2294)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080271, readCount 1080271)
SPO : 0 (peak 2, writeCount 11589, readCount 11589)
UP2 : 0 (peak 1, writeCount 1113, readCount 1113)
DISP: 0 (peak 67, writeCount 419093, readCount 419093)
GW : 0 (peak 45, writeCount 9955915, readCount 9955915)
ICM : 0 (peak 186, writeCount 197622, readCount 197622)
LWP : 2 (peak 15, writeCount 17653, readCount 17651)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:25:10 2019
------------------------------------------------------------
Current snapshot id: 58
DB clean time (in percent of total time) : 23.92 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |70 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |70 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:25:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |22:24:57|3  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:25:10 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  3|Tue Sep 17 22:24:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:25:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:25:10 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2631|        48|                    |          |
|       1|DDLOG               |      2631|        48|                    |          |
|       2|BTCSCHED            |      5264|        49|                    |          |
|       3|RESTART_ALL         |      1053|       286|                    |          |
|       4|ENVCHECK            |     15795|        20|                    |          |
|       5|AUTOABAP            |      1053|       286|                    |          |
|       6|BGRFC_WATCHDOG      |      1054|       286|                    |          |
|       7|AUTOTH              |      1345|        56|                    |          |
|       8|AUTOCCMS            |      5264|        49|                    |          |
|       9|AUTOSECURITY        |      5264|        49|                    |          |
|      10|LOAD_CALCULATION    |    315506|         0|                    |          |
|      11|SPOOLALRM           |      5265|        49|                    |          |
|      12|CALL_DELAYED        |         0|       835|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 58 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:25:10 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-25387
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:25:10:409 2019

DpWpDynCreate: created new work process W12-25388

Tue Sep 17 22:25:12:144 2019

*** ERROR => DpHdlDeadWp: W1 (pid 25387) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25387) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25387)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25388) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25388) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25388)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:25:30:403 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:25:50:404 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:26:10:405 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25681

Tue Sep 17 22:26:18:249 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:26:30:406 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25681 terminated

Tue Sep 17 22:26:50:406 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:27:10:406 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:27:30:407 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:27:50:408 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:28:10:408 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:28:30:408 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:28:50:409 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:29:10:409 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:29:30:410 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:29:50:411 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:30:10:412 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-27332
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-27333

Tue Sep 17 22:30:12:135 2019
*** ERROR => DpHdlDeadWp: W1 (pid 27332) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27332) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27332)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27333) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27333) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27333)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:30:30:412 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:30:50:413 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:26:10 2019, skip new snapshot

Tue Sep 17 22:31:10:413 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 59 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:31:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics                 Tue Sep 17 22:31:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10906743, readCount 10906743)
UPD : 0 (peak 31, writeCount 2295, readCount 2295)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080275, readCount 1080275)
SPO : 0 (peak 2, writeCount 11602, readCount 11602)
UP2 : 0 (peak 1, writeCount 1114, readCount 1114)
DISP: 0 (peak 67, writeCount 419134, readCount 419134)
GW  : 0 (peak 45, writeCount 9956459, readCount 9956459)
ICM : 1 (peak 186, writeCount 197651, readCount 197650)
LWP : 0 (peak 15, writeCount 17668, readCount 17668)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long)         Tue Sep 17 22:31:10 2019
------------------------------------------------------------

Current snapshot id: 59
DB clean time (in percent of total time) : 23.92 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |72 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |72 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table                    Tue Sep 17 22:31:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |22:30:57|3  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
RFC-Connection Table (1 entries) Tue Sep 17 22:31:10 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  3|Tue Sep 17 22:30:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info                         Tue Sep 17 22:31:10 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks                   Tue Sep 17 22:31:10 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2634|        48|                    |          |
|       1|DDLOG               |      2634|        48|                    |          |
|       2|BTCSCHED            |      5270|        49|                    |          |
|       3|RESTART_ALL         |      1054|       226|                    |          |
|       4|ENVCHECK            |     15813|        20|                    |          |
|       5|AUTOABAP            |      1054|       226|                    |          |
|       6|BGRFC_WATCHDOG      |      1055|       226|                    |          |
|       7|AUTOTH              |      1351|        56|                    |          |
|       8|AUTOCCMS            |      5270|        49|                    |          |
|       9|AUTOSECURITY        |      5270|        49|                    |          |
|      10|LOAD_CALCULATION    |    315865|         1|                    |          |
|      11|SPOOLALRM           |      5271|        49|                    |          |
|      12|CALL_DELAYED        |         0|       475|                    |          |

Found 13 periodic tasks


********** SERVER SNAPSHOT 59 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:31:30:414 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:31:50:415 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:32:10:415 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27861

Tue Sep 17 22:32:17:386 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:32:30:416 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27861 terminated

Tue Sep 17 22:32:50:416 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:33:10:417 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:33:30:417 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:33:50:418 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:34:10:418 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:34:30:418 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:34:50:418 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:35:10:419 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-28990
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-28991

Tue Sep 17 22:35:12:137 2019
*** ERROR => DpHdlDeadWp: W1 (pid 28990) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28990) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28990)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28991) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28991) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28991)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:35:30:419 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:35:50:420 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:36:10:420 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:36:30:421 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:36:50:421 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:32:10 2019, skip new snapshot

Tue Sep 17 22:37:10:421 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 60 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:37:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:37:10 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics                 Tue Sep 17 22:37:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10907600, readCount 10907600)
UPD : 0 (peak 31, writeCount 2296, readCount 2296)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080279, readCount 1080279)
SPO : 0 (peak 2, writeCount 11615, readCount 11615)
UP2 : 0 (peak 1, writeCount 1115, readCount 1115)
DISP: 0 (peak 67, writeCount 419175, readCount 419175)
GW  : 0 (peak 45, writeCount 9957049, readCount 9957049)
ICM : 0 (peak 186, writeCount 197678, readCount 197678)
LWP : 0 (peak 15, writeCount 17683, readCount 17683)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long)         Tue Sep 17 22:37:10 2019
------------------------------------------------------------

Current snapshot id: 60
DB clean time (in percent of total time) : 23.93 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |73 |norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |73 |low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table                    Tue Sep 17 22:37:10 2019
------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |22:36:57|5  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:37:10 2019
------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  5|Tue Sep 17 22:36:57 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info                         Tue Sep 17 22:37:10 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks                   Tue Sep 17 22:37:10 2019
------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2637|        48|                    |          |
|       1|DDLOG               |      2637|        48|                    |          |
|       2|BTCSCHED            |      5276|        49|                    |          |
|       3|RESTART_ALL         |      1055|       166|                    |          |
|       4|ENVCHECK            |     15831|        20|                    |          |
|       5|AUTOABAP            |      1055|       166|                    |          |
|       6|BGRFC_WATCHDOG      |      1056|       166|                    |          |
|       7|AUTOTH              |      1357|        56|                    |          |
|       8|AUTOCCMS            |      5276|        49|                    |          |
|       9|AUTOSECURITY        |      5276|        49|                    |          |
|      10|LOAD_CALCULATION    |    316224|         1|                    |          |
|      11|SPOOLALRM           |      5277|        49|                    |          |
|      12|CALL_DELAYED        |         0|       115|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 60 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:37:30:422 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:37:50:422 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:38:10:423 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 29886

Tue Sep 17 22:38:17:446 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:38:30:424 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 29886 terminated

Tue Sep 17 22:38:50:424 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:39:10:425 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:39:30:426 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:39:50:426 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:40:10:426 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-30753
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-30754

Tue Sep 17 22:40:12:148 2019
*** ERROR => DpHdlDeadWp: W1 (pid 30753) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30753) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 30753)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 30754) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30754) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 30754)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:40:30:427 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:40:50:427 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:41:10:427 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:41:30:427 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:41:50:427 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:42:10:427 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:42:30:428 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:42:50:429 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:38:10 2019, skip new snapshot

Tue Sep 17 22:43:10:429 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 61 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:43:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:43:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10908418, readCount 10908418)
UPD : 0 (peak 31, writeCount 2298, readCount 2298)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080287, readCount 1080287)
SPO : 0 (peak 2, writeCount 11629, readCount 11629)
UP2 : 0 (peak 1, writeCount 1117, readCount 1117)
DISP: 0 (peak 67, writeCount 419217, readCount 419217)
GW : 1 (peak 45, writeCount 9957595, readCount 9957594)
ICM : 0 (peak 186, writeCount 197707, readCount 197707)
LWP : 2 (peak 15, writeCount 17713, readCount 17711)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:43:10 2019
------------------------------------------------------------
Current snapshot id: 61
DB clean time (in percent of total time) : 23.94 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |74 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |74 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:43:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |22:42:57|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:43:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 22:42:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:43:10 2019
------------------------------------------------------------
Current pipes in use: 219
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:43:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2640| 48| | |
| 1|DDLOG | 2640| 48| | |
| 2|BTCSCHED | 5282| 49| | |
| 3|RESTART_ALL | 1056| 106| | |
| 4|ENVCHECK | 15849| 20| | |
| 5|AUTOABAP | 1056| 106| | |
| 6|BGRFC_WATCHDOG | 1057| 106| | |
| 7|AUTOTH | 1363| 56| | |
| 8|AUTOCCMS | 5282| 49| | |
| 9|AUTOSECURITY | 5282| 49| | |
| 10|LOAD_CALCULATION | 316583| 1| | |
| 11|SPOOLALRM | 5283| 49| | |
| 12|CALL_DELAYED | 0| 12665| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 61 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:43:30:430 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:43:50:430 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:44:10:431 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 31841

Tue Sep 17 22:44:17:510 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:44:30:431 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 31841 terminated

Tue Sep 17 22:44:50:432 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:45:10:433 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-32614
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-32615

Tue Sep 17 22:45:12:161 2019

*** ERROR => DpHdlDeadWp: W1 (pid 32614) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=32614) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 32614)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 32615) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=32615) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 32615)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:45:30:433 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:45:50:434 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:46:10:435 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:46:30:436 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:46:50:436 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:47:10:437 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:47:30:438 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:47:50:438 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:48:10:439 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:48:30:439 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:48:50:439 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:44:10 2019, skip new snapshot

Tue Sep 17 22:49:10:440 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 62 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:49:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:49:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10909230, readCount 10909230)
UPD : 0 (peak 31, writeCount 2299, readCount 2299)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080291, readCount 1080291)
SPO : 0 (peak 2, writeCount 11642, readCount 11642)
UP2 : 0 (peak 1, writeCount 1118, readCount 1118)
DISP: 0 (peak 67, writeCount 419258, readCount 419258)
GW : 0 (peak 45, writeCount 9958147, readCount 9958147)
ICM : 0 (peak 186, writeCount 197734, readCount 197734)
LWP : 2 (peak 15, writeCount 17728, readCount 17726)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:49:10 2019
------------------------------------------------------------
Current snapshot id: 62
DB clean time (in percent of total time) : 23.95 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |75 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |75 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 22:49:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |22:48:57|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:49:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 22:48:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:49:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 22:49:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2643| 48| | |
| 1|DDLOG | 2643| 48| | |
| 2|BTCSCHED | 5288| 49| | |
| 3|RESTART_ALL | 1057| 46| | |
| 4|ENVCHECK | 15867| 20| | |
| 5|AUTOABAP | 1057| 46| | |
| 6|BGRFC_WATCHDOG | 1058| 46| | |
| 7|AUTOTH | 1369| 56| | |
| 8|AUTOCCMS | 5288| 49| | |
| 9|AUTOSECURITY | 5288| 49| | |
| 10|LOAD_CALCULATION | 316942| 1| | |
| 11|SPOOLALRM | 5289| 49| | |
| 12|CALL_DELAYED | 0| 12305| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 62 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:49:30:441 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:49:50:442 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:50:10:443 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-1636
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-1637
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1638

Tue Sep 17 22:50:12:129 2019

*** ERROR => DpHdlDeadWp: W1 (pid 1636) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1636) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1636)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1637) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1637) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1637)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:50:17:570 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:50:30:444 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1638 terminated

Tue Sep 17 22:50:50:444 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:51:10:445 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:51:30:445 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:51:50:445 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:52:10:446 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:52:30:447 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:52:50:447 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:53:10:448 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:53:30:448 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:53:50:449 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:54:10:449 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:54:30:450 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:54:50:451 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:50:10 2019, skip new snapshot

Tue Sep 17 22:55:10:451 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 63 (Reason: Workprocess 1 died / Time: Tue Sep 17


22:55:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 22:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 22:55:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10910020, readCount 10910020)


UPD : 0 (peak 31, writeCount 2300, readCount 2300)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080295, readCount 1080295)
SPO : 0 (peak 2, writeCount 11655, readCount 11655)
UP2 : 0 (peak 1, writeCount 1119, readCount 1119)
DISP: 0 (peak 67, writeCount 419303, readCount 419303)
GW : 0 (peak 45, writeCount 9958702, readCount 9958702)
ICM : 0 (peak 186, writeCount 197761, readCount 197761)
LWP : 2 (peak 15, writeCount 17743, readCount 17741)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 22:55:10 2019


------------------------------------------------------------

Current snapshot id: 63


DB clean time (in percent of total time) : 23.96 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|


Time |Program |Cli|User |Action
|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|---
--|----------------------------------------|---|------------|--------------------|-
-------------------|
| 1| |DIA |WP_KILL| |76 |norm|T138_U21626_M0 |HTTP_NORM| | |
|CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| |
|
| 12| |BTC |WP_KILL| |76 |low |T105_U21576_M0 |BATCH | | |
|SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD |
|

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 22:55:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |


Program |Prio|Tasks|Application-Info
|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|-
---------------------------------------|----|-----|--------------------------------
------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |22:54:57|3 |
SAPMSSY1 |norm| |
| | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |
SMREP_PROCESS_BW_DATA_QUEUE |low | |
| | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |
SAPMHTTP |norm| |
| | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 22:55:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 22:54:57 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 22:55:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884
Periodic Tasks Tue Sep 17 22:55:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2646| 48| | |
| 1|DDLOG | 2646| 48| | |
| 2|BTCSCHED | 5294| 49| | |
| 3|RESTART_ALL | 1059| 286| | |
| 4|ENVCHECK | 15885| 20| | |
| 5|AUTOABAP | 1059| 286| | |
| 6|BGRFC_WATCHDOG | 1060| 286| | |
| 7|AUTOTH | 1375| 56| | |
| 8|AUTOCCMS | 5294| 49| | |
| 9|AUTOSECURITY | 5294| 49| | |
| 10|LOAD_CALCULATION | 317300| 0| | |
| 11|SPOOLALRM | 5295| 49| | |
| 12|CALL_DELAYED | 0| 11945| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 63 (Reason: Workprocess 1 died / Time: Tue Sep 17 22:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-3667
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:55:10:457 2019


DpWpDynCreate: created new work process W12-3668

Tue Sep 17 22:55:12:233 2019


*** ERROR => DpHdlDeadWp: W1 (pid 3667) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3667) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3667)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3668) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3668) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 3668)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:55:30:452 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:55:50:452 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 22:56:10:453 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 3946

Tue Sep 17 22:56:17:545 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:56:30:453 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 3946 terminated

Tue Sep 17 22:56:50:454 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:57:10:454 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:57:30:455 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:57:50:455 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:58:10:456 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:58:30:456 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:58:50:457 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:59:10:457 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:59:30:458 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 22:59:50:458 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 23:00:10:459 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-5404
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-5405

Tue Sep 17 23:00:12:209 2019

*** ERROR => DpHdlDeadWp: W1 (pid 5404) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5404) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5404)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5405) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5405) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5405)
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 23:00:30:460 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 23:00:50:460 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 22:56:10 2019, skip new snapshot

Tue Sep 17 23:01:10:461 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 64 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:01:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:01:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10910848, readCount 10910848)
UPD : 0 (peak 31, writeCount 2301, readCount 2301)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080299, readCount 1080299)
SPO : 0 (peak 2, writeCount 11668, readCount 11668)
UP2 : 0 (peak 1, writeCount 1120, readCount 1120)
DISP: 0 (peak 67, writeCount 419344, readCount 419344)
GW : 1 (peak 45, writeCount 9959246, readCount 9959245)
ICM : 0 (peak 186, writeCount 197788, readCount 197788)
LWP : 0 (peak 15, writeCount 17758, readCount 17758)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests
Workprocess Table (long) Tue Sep 17 23:01:10 2019
------------------------------------------------------------

Current snapshot id: 64


DB clean time (in percent of total time) : 23.97 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |78 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |78 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 23:01:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:00:58|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:01:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 23:00:58 2019 |

Found 1 RFC-Connections
CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:01:10 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:01:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2649| 48| | |
| 1|DDLOG | 2649| 48| | |
| 2|BTCSCHED | 5300| 49| | |
| 3|RESTART_ALL | 1060| 226| | |
| 4|ENVCHECK | 15903| 20| | |
| 5|AUTOABAP | 1060| 226| | |
| 6|BGRFC_WATCHDOG | 1061| 226| | |
| 7|AUTOTH | 1381| 56| | |
| 8|AUTOCCMS | 5300| 49| | |
| 9|AUTOSECURITY | 5300| 49| | |
| 10|LOAD_CALCULATION | 317659| 1| | |
| 11|SPOOLALRM | 5301| 49| | |
| 12|CALL_DELAYED | 0| 11585| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 64 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:01:30:462 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:01:50:462 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:02:10:463 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 8740

Tue Sep 17 23:02:16:918 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:02:30:464 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 8740 terminated

Tue Sep 17 23:02:50:464 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:03:10:465 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:03:30:466 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:03:50:466 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:04:10:468 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:04:30:468 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:04:50:468 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:05:10:468 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21127
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21128

Tue Sep 17 23:05:12:174 2019

*** ERROR => DpHdlDeadWp: W1 (pid 21127) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21127) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21127)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21128) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21128) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21128)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:05:30:469 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:05:50:469 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:06:10:470 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:06:30:470 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:06:50:471 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:02:10 2019, skip new snapshot

Tue Sep 17 23:07:10:471 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 65 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:07:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:07:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10911910, readCount 10911910)
UPD : 0 (peak 31, writeCount 2302, readCount 2302)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080303, readCount 1080303)
SPO : 0 (peak 2, writeCount 11681, readCount 11681)
UP2 : 0 (peak 1, writeCount 1121, readCount 1121)
DISP: 0 (peak 67, writeCount 419385, readCount 419385)
GW : 0 (peak 45, writeCount 9960036, readCount 9960036)
ICM : 0 (peak 186, writeCount 197815, readCount 197815)
LWP : 0 (peak 15, writeCount 17773, readCount 17773)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:07:10 2019


------------------------------------------------------------

Current snapshot id: 65


DB clean time (in percent of total time) : 23.98 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |79 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |79 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Tue Sep 17 23:07:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:06:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:07:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 23:06:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:07:10 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884
Periodic Tasks Tue Sep 17 23:07:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2652| 48| | |
| 1|DDLOG | 2652| 48| | |
| 2|BTCSCHED | 5306| 49| | |
| 3|RESTART_ALL | 1061| 166| | |
| 4|ENVCHECK | 15921| 20| | |
| 5|AUTOABAP | 1061| 166| | |
| 6|BGRFC_WATCHDOG | 1062| 166| | |
| 7|AUTOTH | 1387| 56| | |
| 8|AUTOCCMS | 5306| 49| | |
| 9|AUTOSECURITY | 5306| 49| | |
| 10|LOAD_CALCULATION | 318018| 1| | |
| 11|SPOOLALRM | 5307| 49| | |
| 12|CALL_DELAYED | 0| 11225| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 65 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:07:30:472 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:07:50:472 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:08:10:473 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1059

Tue Sep 17 23:08:16:949 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:08:30:474 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1059 terminated

Tue Sep 17 23:08:50:475 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:09:10:476 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:09:30:476 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:09:50:477 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:10:10:478 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-4384
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-4385

Tue Sep 17 23:10:12:180 2019

*** ERROR => DpHdlDeadWp: W1 (pid 4384) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4384) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 4384)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 4385) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4385) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 4385)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:10:30:478 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:10:50:479 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:11:10:479 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:11:30:480 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:11:50:480 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:12:10:481 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:12:30:482 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:12:50:482 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:08:10 2019, skip new snapshot

Tue Sep 17 23:13:10:482 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 66 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:13:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:13:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10912764, readCount 10912764)
UPD : 0 (peak 31, writeCount 2304, readCount 2304)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080311, readCount 1080311)
SPO : 0 (peak 2, writeCount 11695, readCount 11695)
UP2 : 0 (peak 1, writeCount 1123, readCount 1123)
DISP: 0 (peak 67, writeCount 419426, readCount 419426)
GW : 1 (peak 45, writeCount 9960640, readCount 9960639)
ICM : 0 (peak 186, writeCount 197844, readCount 197844)
LWP : 2 (peak 15, writeCount 17803, readCount 17801)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 245 (rq_id 29280499, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:13:10 2019
------------------------------------------------------------
Current snapshot id: 66
DB clean time (in percent of total time) : 23.99 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |80 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 4|19804 |DIA |WP_RUN | | |norm|T32_U14518_M0 |HTTP_NORM| | |1|<HANDLE PLUGIN> |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |80 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 23:13:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T32_U14518_M0 |001|SM_EXTERN_WS|10.54.36.37 |23:13:09|4 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:12:58|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:13:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Tue Sep 17 23:12:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:13:10 2019
------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:13:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2655| 48| | |
| 1|DDLOG | 2655| 48| | |
| 2|BTCSCHED | 5312| 49| | |
| 3|RESTART_ALL | 1062| 106| | |
| 4|ENVCHECK | 15939| 20| | |
| 5|AUTOABAP | 1062| 106| | |
| 6|BGRFC_WATCHDOG | 1063| 106| | |
| 7|AUTOTH | 1393| 56| | |
| 8|AUTOCCMS | 5312| 49| | |
| 9|AUTOSECURITY | 5312| 49| | |
| 10|LOAD_CALCULATION | 318377| 1| | |
| 11|SPOOLALRM | 5313| 49| | |
| 12|CALL_DELAYED | 0| 10865| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 66 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
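[Editor's note] The DpHdlDeadWp / DpTraceWpStatus entries in this trace follow a fixed format, so the recurring work-process deaths can be extracted mechanically. A minimal sketch (illustrative only, not an SAP tool; the function name `dead_workprocesses` is invented here) that pulls each death event out of a dev_disp trace:

```python
import re

# Matches dispatcher "work process died" errors, e.g.:
#   *** ERROR => DpHdlDeadWp: W1 (pid 4384) died (severity=0, status=65280) [dpxxwp.c 1463]
DEAD_RE = re.compile(
    r"DpHdlDeadWp: (W\d+) \(pid (\d+)\) died \(severity=(\d+), status=(\d+)\)"
)

def dead_workprocesses(lines):
    """Yield (wp_name, pid, status) for every work-process death in the trace."""
    for line in lines:
        m = DEAD_RE.search(line)
        if m:
            yield m.group(1), int(m.group(2)), int(m.group(4))

# Sample lines copied from the trace above.
sample = [
    "*** ERROR => DpHdlDeadWp: W1 (pid 4384) died (severity=0, status=65280) [dpxxwp.c 1463]",
    "DpTraceWpStatus: child (pid=4384) exited with exit code 255",
    "*** ERROR => DpHdlDeadWp: W12 (pid 4385) died (severity=0, status=65280) [dpxxwp.c 1463]",
]
events = list(dead_workprocesses(sample))
# status=65280 carries the child's exit code in its high byte: 65280 >> 8 == 255,
# which matches the "exited with exit code 255" line the dispatcher prints next.
```

The high-byte relationship explains why every `status=65280` death is paired with an `exit code 255` trace line.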


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:13:30:483 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:13:50:484 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:14:10:484 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 5683

Tue Sep 17 23:14:17:179 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:14:30:485 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 5683 terminated

Tue Sep 17 23:14:50:485 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:15:10:485 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-6403
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-6404

Tue Sep 17 23:15:11:937 2019

*** ERROR => DpHdlDeadWp: W1 (pid 6403) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6403) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6403)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6404) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6404) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6404)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:15:30:486 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:15:50:487 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:16:10:487 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:16:30:488 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:16:50:488 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:17:10:488 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:17:30:488 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:17:50:489 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:18:10:490 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:18:30:491 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:18:50:492 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:14:10 2019, skip new snapshot

Tue Sep 17 23:19:10:492 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 67 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:19:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:19:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10913629, readCount 10913629)
UPD : 0 (peak 31, writeCount 2305, readCount 2305)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080315, readCount 1080315)
SPO : 0 (peak 2, writeCount 11708, readCount 11708)
UP2 : 0 (peak 1, writeCount 1124, readCount 1124)
DISP: 0 (peak 67, writeCount 419467, readCount 419467)
GW : 0 (peak 45, writeCount 9961232, readCount 9961232)
ICM : 1 (peak 186, writeCount 197871, readCount 197870)
LWP : 2 (peak 15, writeCount 17818, readCount 17816)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:19:10 2019
------------------------------------------------------------
Current snapshot id: 67
DB clean time (in percent of total time) : 23.99 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |81 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |81 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 23:19:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:18:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:19:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 23:18:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:19:10 2019
------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:19:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2658| 48| | |
| 1|DDLOG | 2658| 48| | |
| 2|BTCSCHED | 5318| 49| | |
| 3|RESTART_ALL | 1063| 46| | |
| 4|ENVCHECK | 15957| 20| | |
| 5|AUTOABAP | 1063| 46| | |
| 6|BGRFC_WATCHDOG | 1064| 46| | |
| 7|AUTOTH | 1399| 56| | |
| 8|AUTOCCMS | 5318| 49| | |
| 9|AUTOSECURITY | 5318| 49| | |
| 10|LOAD_CALCULATION | 318735| 1| | |
| 11|SPOOLALRM | 5319| 49| | |
| 12|CALL_DELAYED | 0| 10505| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 67 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
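[Editor's note] The restart cadence visible in this trace (W1-6403 created at 23:15:10 and dead at 23:15:11, W1-7762 created at 23:20:10 and dead at 23:20:12) can be checked by pairing each DpWpDynCreate entry with the most recent timestamp header. A small illustrative sketch (not an SAP tool; `creation_times` is a name invented here), using the timestamp format this trace actually prints:

```python
import re
from datetime import datetime

# Timestamp headers look like "Tue Sep 17 23:20:10:494 2019" (milliseconds after seconds).
TS_RE = re.compile(r"^(\w{3} \w{3} +\d+ \d{2}:\d{2}:\d{2}):\d+ (\d{4})$")
# Dynamic work-process creation, e.g. "DpWpDynCreate: created new work process W1-7762"
CREATE_RE = re.compile(r"DpWpDynCreate: created new work process (W\d+)-(\d+)")

def creation_times(lines):
    """Pair each DpWpDynCreate entry with the most recent timestamp header seen."""
    current = None
    events = []
    for line in lines:
        ts = TS_RE.match(line.strip())
        if ts:
            current = datetime.strptime(ts.group(1) + " " + ts.group(2),
                                        "%a %b %d %H:%M:%S %Y")
            continue
        m = CREATE_RE.search(line)
        if m and current is not None:
            events.append((m.group(1), int(m.group(2)), current))
    return events

# Sample lines copied from the trace above: two consecutive W1 restart attempts.
sample = [
    "Tue Sep 17 23:15:10:485 2019",
    "DpWpDynCreate: created new work process W1-6403",
    "Tue Sep 17 23:20:10:494 2019",
    "DpWpDynCreate: created new work process W1-7762",
]
events = creation_times(sample)
```

On this excerpt the two W1 creations are exactly five minutes apart, matching the dispatcher's periodic attempt to replace the dying work processes.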


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:19:30:492 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:19:50:493 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:20:10:494 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-7762
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-7763
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7764

Tue Sep 17 23:20:12:200 2019

*** ERROR => DpHdlDeadWp: W1 (pid 7762) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7762) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7762)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7763) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7763) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7763)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:20:16:954 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:20:30:495 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7764 terminated

Tue Sep 17 23:20:50:496 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:21:10:497 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:21:30:497 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:21:50:498 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:22:10:498 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:22:30:499 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:22:50:500 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:23:10:500 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:23:30:501 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:23:50:501 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:24:10:501 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:24:30:502 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:24:50:503 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:20:10 2019, skip new snapshot

Tue Sep 17 23:25:10:503 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 68 (Reason: Workprocess 1 died / Time: Tue Sep 17


23:25:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:25:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10914628, readCount 10914628)


UPD : 0 (peak 31, writeCount 2306, readCount 2306)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080319, readCount 1080319)
SPO : 0 (peak 2, writeCount 11721, readCount 11721)
UP2 : 0 (peak 1, writeCount 1125, readCount 1125)
DISP: 0 (peak 67, writeCount 419511, readCount 419511)
GW : 0 (peak 45, writeCount 9962001, readCount 9962001)
ICM : 0 (peak 186, writeCount 197898, readCount 197898)
LWP : 2 (peak 15, writeCount 17833, readCount 17831)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:25:10 2019
------------------------------------------------------------

Current snapshot id: 68
DB clean time (in percent of total time) : 24.00 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |82 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |82 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 23:25:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:24:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:25:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Tue Sep 17 23:24:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:25:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:25:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2661| 48| | |
| 1|DDLOG | 2661| 48| | |
| 2|BTCSCHED | 5324| 49| | |
| 3|RESTART_ALL | 1065| 286| | |
| 4|ENVCHECK | 15975| 20| | |
| 5|AUTOABAP | 1065| 286| | |
| 6|BGRFC_WATCHDOG | 1066| 286| | |
| 7|AUTOTH | 1405| 56| | |
| 8|AUTOCCMS | 5324| 49| | |
| 9|AUTOSECURITY | 5324| 49| | |
| 10|LOAD_CALCULATION | 319094| 1| | |
| 11|SPOOLALRM | 5325| 49| | |
| 12|CALL_DELAYED | 0| 10145| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 68 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-9566
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:25:10:509 2019


DpWpDynCreate: created new work process W12-9567

Tue Sep 17 23:25:12:236 2019


*** ERROR => DpHdlDeadWp: W1 (pid 9566) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9566) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 9566)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 9567) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9567) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 9567)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:25:30:504 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:25:50:505 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:26:10:506 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9834

Tue Sep 17 23:26:17:330 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:26:30:506 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9834 terminated

Tue Sep 17 23:26:50:506 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:27:10:506 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:27:30:507 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:27:50:507 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:28:10:508 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:28:30:509 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:28:50:510 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:29:10:510 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:29:30:511 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:29:50:512 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:30:10:512 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-11408
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-11409

Tue Sep 17 23:30:12:220 2019

*** ERROR => DpHdlDeadWp: W1 (pid 11408) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11408) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 11408)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 11409) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11409) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 11409)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:30:30:513 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:30:50:514 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:26:10 2019, skip new snapshot

Tue Sep 17 23:31:10:515 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 69 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:31:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:31:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10915492, readCount 10915492)
UPD : 0 (peak 31, writeCount 2307, readCount 2307)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080323, readCount 1080323)
SPO : 0 (peak 2, writeCount 11734, readCount 11734)
UP2 : 0 (peak 1, writeCount 1126, readCount 1126)
DISP: 0 (peak 67, writeCount 419552, readCount 419552)
GW : 0 (peak 45, writeCount 9962565, readCount 9962565)
ICM : 0 (peak 186, writeCount 197927, readCount 197927)
LWP : 0 (peak 15, writeCount 17848, readCount 17848)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:31:10 2019
------------------------------------------------------------

Current snapshot id: 69
DB clean time (in percent of total time) : 24.01 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |84 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |84 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 23:31:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:30:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:31:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 23:30:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:31:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:31:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2664| 48| | |
| 1|DDLOG | 2664| 48| | |
| 2|BTCSCHED | 5330| 49| | |
| 3|RESTART_ALL | 1066| 226| | |
| 4|ENVCHECK | 15993| 20| | |
| 5|AUTOABAP | 1066| 226| | |
| 6|BGRFC_WATCHDOG | 1067| 226| | |
| 7|AUTOTH | 1411| 56| | |
| 8|AUTOCCMS | 5330| 49| | |
| 9|AUTOSECURITY | 5330| 49| | |
| 10|LOAD_CALCULATION | 319453| 1| | |
| 11|SPOOLALRM | 5331| 49| | |
| 12|CALL_DELAYED | 0| 9785| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 69 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:31:30:516 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:31:50:517 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:32:10:517 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 11999

Tue Sep 17 23:32:17:568 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:32:30:518 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 11999 terminated

Tue Sep 17 23:32:50:518 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:33:10:519 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:33:30:519 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:33:50:520 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:34:10:520 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:34:30:520 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:34:50:521 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:35:10:523 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-13370
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-13371

Tue Sep 17 23:35:12:244 2019

*** ERROR => DpHdlDeadWp: W1 (pid 13370) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13370) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13370)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13371) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13371) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13371)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:35:30:523 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:35:50:524 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:36:10:525 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:36:30:526 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:36:50:526 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:32:10 2019, skip new snapshot

Tue Sep 17 23:37:10:527 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 70 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:37:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:37:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10916377, readCount 10916377)
UPD : 0 (peak 31, writeCount 2308, readCount 2308)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080327, readCount 1080327)
SPO : 0 (peak 2, writeCount 11747, readCount 11747)
UP2 : 0 (peak 1, writeCount 1127, readCount 1127)
DISP: 0 (peak 67, writeCount 419593, readCount 419593)
GW : 0 (peak 45, writeCount 9963181, readCount 9963181)
ICM : 1 (peak 186, writeCount 197954, readCount 197953)
LWP : 0 (peak 15, writeCount 17863, readCount 17863)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 283 (rq_id 29290482, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:37:10 2019
------------------------------------------------------------

Current snapshot id: 70
DB clean time (in percent of total time) : 24.02 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |85 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |85 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 23:37:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:36:58|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:37:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Tue Sep 17 23:36:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:37:10 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:37:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2667| 48| | |
| 1|DDLOG | 2667| 48| | |
| 2|BTCSCHED | 5336| 49| | |
| 3|RESTART_ALL | 1067| 166| | |
| 4|ENVCHECK | 16011| 20| | |
| 5|AUTOABAP | 1067| 166| | |
| 6|BGRFC_WATCHDOG | 1068| 166| | |
| 7|AUTOTH | 1417| 56| | |
| 8|AUTOCCMS | 5336| 49| | |
| 9|AUTOSECURITY | 5336| 49| | |
| 10|LOAD_CALCULATION | 319812| 1| | |
| 11|SPOOLALRM | 5337| 49| | |
| 12|CALL_DELAYED | 0| 9425| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 70 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:37:30:527 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:37:50:527 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:38:10:528 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 14175

Tue Sep 17 23:38:17:998 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new
snapshot

Tue Sep 17 23:38:30:529 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 14175 terminated

Tue Sep 17 23:38:50:529 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:39:10:530 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:39:30:531 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:39:50:531 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:40:10:532 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-15041
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-15042

Tue Sep 17 23:40:12:253 2019


*** ERROR => DpHdlDeadWp: W1 (pid 15041) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15041) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 15041)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 15042) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15042) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 15042)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:40:30:533 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:40:50:534 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:41:10:534 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:41:30:535 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:41:50:536 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:42:10:537 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:42:30:537 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:42:50:538 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:38:10 2019, skip new snapshot

Tue Sep 17 23:43:10:538 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 71 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:43:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:43:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10917202, readCount 10917202)
UPD : 0 (peak 31, writeCount 2310, readCount 2310)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080335, readCount 1080335)
SPO : 0 (peak 2, writeCount 11761, readCount 11761)
UP2 : 0 (peak 1, writeCount 1129, readCount 1129)
DISP: 0 (peak 67, writeCount 419634, readCount 419634)
GW : 0 (peak 45, writeCount 9963749, readCount 9963749)
ICM : 1 (peak 186, writeCount 197983, readCount 197982)
LWP : 2 (peak 15, writeCount 17893, readCount 17891)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 273 (rq_id 29292814, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:43:10 2019
------------------------------------------------------------

Current snapshot id: 71

DB clean time (in percent of total time) : 24.03 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |86 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |86 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Tue Sep 17 23:43:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:42:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:43:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Tue Sep 17 23:42:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:43:10 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:43:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2670| 48| | |
| 1|DDLOG | 2670| 48| | |
| 2|BTCSCHED | 5342| 49| | |
| 3|RESTART_ALL | 1068| 106| | |
| 4|ENVCHECK | 16029| 20| | |
| 5|AUTOABAP | 1068| 106| | |
| 6|BGRFC_WATCHDOG | 1069| 106| | |
| 7|AUTOTH | 1423| 56| | |
| 8|AUTOCCMS | 5342| 49| | |
| 9|AUTOSECURITY | 5342| 49| | |
| 10|LOAD_CALCULATION | 320171| 1| | |
| 11|SPOOLALRM | 5343| 49| | |
| 12|CALL_DELAYED | 0| 9065| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 71 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:43:30:539 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:43:50:539 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:44:10:539 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16174

Tue Sep 17 23:44:18:118 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:44:30:540 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16174 terminated

Tue Sep 17 23:44:50:541 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:45:10:542 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-16814
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-16815

Tue Sep 17 23:45:12:284 2019


*** ERROR => DpHdlDeadWp: W1 (pid 16814) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16814) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16814)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16815) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16815) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16815)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:45:30:543 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:45:50:543 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:46:10:543 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:46:30:544 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:46:50:545 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:47:10:546 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:47:30:546 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:47:50:547 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:48:10:547 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:48:30:547 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:48:50:548 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:44:10 2019, skip new snapshot

Tue Sep 17 23:49:10:548 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 72 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:49:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Tue Sep 17 23:49:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10918051, readCount 10918051)
UPD : 0 (peak 31, writeCount 2311, readCount 2311)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080339, readCount 1080339)
SPO : 0 (peak 2, writeCount 11774, readCount 11774)
UP2 : 0 (peak 1, writeCount 1130, readCount 1130)
DISP: 0 (peak 67, writeCount 419675, readCount 419675)
GW : 0 (peak 45, writeCount 9964341, readCount 9964341)
ICM : 0 (peak 186, writeCount 198010, readCount 198010)
LWP : 2 (peak 15, writeCount 17908, readCount 17906)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:49:10 2019
------------------------------------------------------------

Current snapshot id: 72

DB clean time (in percent of total time) : 24.04 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |87 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |87 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Tue Sep 17 23:49:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:48:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:49:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 23:48:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:49:10 2019
------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:49:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2673| 48| | |
| 1|DDLOG | 2673| 48| | |
| 2|BTCSCHED | 5348| 49| | |
| 3|RESTART_ALL | 1069| 46| | |
| 4|ENVCHECK | 16047| 20| | |
| 5|AUTOABAP | 1069| 46| | |
| 6|BGRFC_WATCHDOG | 1070| 46| | |
| 7|AUTOTH | 1429| 56| | |
| 8|AUTOCCMS | 5348| 49| | |
| 9|AUTOSECURITY | 5348| 49| | |
| 10|LOAD_CALCULATION | 320530| 1| | |
| 11|SPOOLALRM | 5349| 49| | |
| 12|CALL_DELAYED | 0| 8705| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 72 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:49:30:549 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:49:50:550 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:50:10:550 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-18136
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-18137
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18138

Tue Sep 17 23:50:12:262 2019


*** ERROR => DpHdlDeadWp: W1 (pid 18136) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18136) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18136)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18137) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18137) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18137)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:50:17:587 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:50:30:550 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 18138 terminated

Tue Sep 17 23:50:50:551 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:51:10:551 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:51:30:551 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:51:50:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:52:10:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:52:30:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:52:50:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:53:10:553 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:53:30:553 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:53:50:554 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:54:10:554 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:54:30:554 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:54:50:555 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:50:10 2019, skip new snapshot

Tue Sep 17 23:55:10:556 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 73 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:55:10 2019) - begin **********

Server smprd02_SMP_00, Tue Sep 17 23:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0
Queue Statistics Tue Sep 17 23:55:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10918884, readCount 10918884)
UPD : 0 (peak 31, writeCount 2312, readCount 2312)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080343, readCount 1080343)
SPO : 0 (peak 2, writeCount 11787, readCount 11787)
UP2 : 0 (peak 1, writeCount 1131, readCount 1131)
DISP: 0 (peak 67, writeCount 419720, readCount 419720)
GW  : 0 (peak 45, writeCount 9964916, readCount 9964916)
ICM : 0 (peak 186, writeCount 198037, readCount 198037)
LWP : 2 (peak 15, writeCount 17923, readCount 17921)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Tue Sep 17 23:55:10 2019
------------------------------------------------------------

Current snapshot id: 73
DB clean time (in percent of total time) : 24.05 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |88 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |88 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Tue Sep 17 23:55:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |23:54:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Tue Sep 17 23:55:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Tue Sep 17 23:54:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Tue Sep 17 23:55:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Tue Sep 17 23:55:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2676| 48| | |
| 1|DDLOG | 2676| 48| | |
| 2|BTCSCHED | 5354| 49| | |
| 3|RESTART_ALL | 1071| 286| | |
| 4|ENVCHECK | 16065| 20| | |
| 5|AUTOABAP | 1071| 286| | |
| 6|BGRFC_WATCHDOG | 1072| 286| | |
| 7|AUTOTH | 1435| 56| | |
| 8|AUTOCCMS | 5354| 49| | |
| 9|AUTOSECURITY | 5354| 49| | |
| 10|LOAD_CALCULATION | 320888| 0| | |
| 11|SPOOLALRM | 5355| 49| | |
| 12|CALL_DELAYED | 0| 8345| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 73 (Reason: Workprocess 1 died / Time: Tue Sep 17 23:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-19911
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:55:10:562 2019
DpWpDynCreate: created new work process W12-19912

Tue Sep 17 23:55:12:275 2019
*** ERROR => DpHdlDeadWp: W1 (pid 19911) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19911) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19911)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19912) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19912) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19912)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:55:30:557 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:55:50:557 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Tue Sep 17 23:56:10:557 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 20174

Tue Sep 17 23:56:17:786 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:56:30:558 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 20174 terminated

Tue Sep 17 23:56:50:558 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:57:10:558 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:57:30:558 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:57:50:559 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:58:10:559 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:58:30:560 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:58:50:560 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:59:10:561 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:59:30:562 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Tue Sep 17 23:59:50:563 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Wed Sep 18 00:00:10:563 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21794
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21795

Wed Sep 18 00:00:12:318 2019
*** ERROR => DpHdlDeadWp: W1 (pid 21794) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21794) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21794)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21795) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21795) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21795)
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Wed Sep 18 00:00:30:564 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Wed Sep 18 00:00:50:565 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Tue Sep 17 23:56:10 2019, skip new snapshot

Wed Sep 18 00:01:10:565 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 74 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:01:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:01:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10919719, readCount 10919719)
UPD : 0 (peak 31, writeCount 2313, readCount 2313)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080347, readCount 1080347)
SPO : 0 (peak 2, writeCount 11800, readCount 11800)
UP2 : 0 (peak 1, writeCount 1132, readCount 1132)
DISP: 0 (peak 67, writeCount 419761, readCount 419761)
GW  : 1 (peak 45, writeCount 9965486, readCount 9965485)
ICM : 0 (peak 186, writeCount 198064, readCount 198064)
LWP : 0 (peak 15, writeCount 17938, readCount 17938)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 292 (rq_id 29299763, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:01:10 2019
------------------------------------------------------------

Current snapshot id: 74
DB clean time (in percent of total time) : 24.05 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |90 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |90 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 00:01:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:00:58|2 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:01:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Wed Sep 18 00:00:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:01:10 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:01:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2679| 48| | |
| 1|DDLOG | 2679| 48| | |
| 2|BTCSCHED | 5360| 49| | |
| 3|RESTART_ALL | 1072| 226| | |
| 4|ENVCHECK | 16083| 20| | |
| 5|AUTOABAP | 1072| 226| | |
| 6|BGRFC_WATCHDOG | 1073| 226| | |
| 7|AUTOTH | 1441| 56| | |
| 8|AUTOCCMS | 5360| 49| | |
| 9|AUTOSECURITY | 5360| 49| | |
| 10|LOAD_CALCULATION | 321247| 1| | |
| 11|SPOOLALRM | 5361| 49| | |
| 12|CALL_DELAYED | 0| 7985| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 74 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:01:30:566 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:01:50:567 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:02:10:567 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22581

Wed Sep 18 00:02:18:306 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:02:30:568 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22581 terminated

Wed Sep 18 00:02:50:574 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:03:10:574 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:03:30:574 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:03:50:575 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:04:10:575 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:04:30:575 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:04:50:576 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:05:10:577 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3485
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3487

Wed Sep 18 00:05:12:492 2019
*** ERROR => DpHdlDeadWp: W1 (pid 3485) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3485) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3485)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3487) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W12 (pid = 3487)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:05:30:578 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:05:50:578 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:06:10:579 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:06:30:580 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:06:50:580 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:02:10 2019, skip new snapshot

Wed Sep 18 00:07:10:581 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 75 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:07:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:07:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10920797, readCount 10920797)
UPD : 0 (peak 31, writeCount 2314, readCount 2314)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080351, readCount 1080351)
SPO : 0 (peak 2, writeCount 11813, readCount 11813)
UP2 : 0 (peak 1, writeCount 1133, readCount 1133)
DISP: 0 (peak 67, writeCount 419801, readCount 419801)
GW  : 0 (peak 45, writeCount 9966288, readCount 9966288)
ICM : 0 (peak 186, writeCount 198091, readCount 198091)
LWP : 0 (peak 15, writeCount 17953, readCount 17953)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:07:10 2019
------------------------------------------------------------

Current snapshot id: 75
DB clean time (in percent of total time) : 24.06 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |91 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |91 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 00:07:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:06:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:07:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Wed Sep 18 00:06:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:07:10 2019
------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:07:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2682| 48| | |
| 1|DDLOG | 2682| 48| | |
| 2|BTCSCHED | 5366| 49| | |
| 3|RESTART_ALL | 1073| 166| | |
| 4|ENVCHECK | 16101| 20| | |
| 5|AUTOABAP | 1073| 166| | |
| 6|BGRFC_WATCHDOG | 1074| 166| | |
| 7|AUTOTH | 1447| 56| | |
| 8|AUTOCCMS | 5366| 49| | |
| 9|AUTOSECURITY | 5366| 49| | |
| 10|LOAD_CALCULATION | 321605| 0| | |
| 11|SPOOLALRM | 5367| 49| | |
| 12|CALL_DELAYED | 0| 7625| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 75 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:07:30:582 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:07:50:582 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:08:10:582 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9817

Wed Sep 18 00:08:18:400 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:08:30:582 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9817 terminated

Wed Sep 18 00:08:50:583 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:09:10:583 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:09:30:583 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:09:50:584 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:10:10:584 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-18206
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18207

Wed Sep 18 00:10:12:220 2019

*** ERROR => DpHdlDeadWp: W1 (pid 18206) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18206) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18206)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18207) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18207) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18207)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:10:30:585 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:10:50:585 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:11:10:586 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:11:30:587 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:11:50:587 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:12:10:588 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:12:30:589 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:12:50:589 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:08:10 2019, skip new snapshot

Wed Sep 18 00:13:10:590 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 76 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:13:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:13:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10921651, readCount 10921651)
UPD : 0 (peak 31, writeCount 2316, readCount 2316)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080359, readCount 1080359)
SPO : 0 (peak 2, writeCount 11827, readCount 11827)
UP2 : 0 (peak 1, writeCount 1135, readCount 1135)
DISP: 0 (peak 67, writeCount 419842, readCount 419842)
GW : 0 (peak 45, writeCount 9966900, readCount 9966900)
ICM : 1 (peak 186, writeCount 198120, readCount 198119)
LWP : 2 (peak 15, writeCount 17983, readCount 17981)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:13:10 2019
------------------------------------------------------------

Current snapshot id: 76
DB clean time (in percent of total time) : 24.07 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |92 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |92 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 00:13:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:12:58|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:13:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Wed Sep 18 00:12:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:13:10 2019
------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:13:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2685| 48| | |
| 1|DDLOG | 2685| 48| | |
| 2|BTCSCHED | 5372| 49| | |
| 3|RESTART_ALL | 1074| 106| | |
| 4|ENVCHECK | 16119| 20| | |
| 5|AUTOABAP | 1074| 106| | |
| 6|BGRFC_WATCHDOG | 1075| 106| | |
| 7|AUTOTH | 1453| 56| | |
| 8|AUTOCCMS | 5372| 49| | |
| 9|AUTOSECURITY | 5372| 49| | |
| 10|LOAD_CALCULATION | 321964| 1| | |
| 11|SPOOLALRM | 5373| 49| | |
| 12|CALL_DELAYED | 0| 7265| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 76 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:13:30:590 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:13:50:591 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:14:10:592 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 24308

Wed Sep 18 00:14:18:494 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:14:30:593 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 24308 terminated

Wed Sep 18 00:14:50:594 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:15:10:594 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-27388
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-27389

Wed Sep 18 00:15:12:275 2019


*** ERROR => DpHdlDeadWp: W1 (pid 27388) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27388) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27388)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27389) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27389) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27389)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:15:30:595 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:15:50:595 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:16:10:596 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:16:30:597 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:16:50:597 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:17:10:598 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:17:30:599 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:17:50:599 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:18:10:600 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:18:30:601 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:18:50:601 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:14:10 2019, skip new snapshot

Wed Sep 18 00:19:10:602 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 77 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:19:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:19:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10922547, readCount 10922547)
UPD : 0 (peak 31, writeCount 2317, readCount 2317)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080363, readCount 1080363)
SPO : 0 (peak 2, writeCount 11840, readCount 11840)
UP2 : 0 (peak 1, writeCount 1136, readCount 1136)
DISP: 0 (peak 67, writeCount 419883, readCount 419883)
GW : 0 (peak 45, writeCount 9967528, readCount 9967528)
ICM : 0 (peak 186, writeCount 198147, readCount 198147)
LWP : 2 (peak 15, writeCount 17998, readCount 17996)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:19:10 2019
------------------------------------------------------------

Current snapshot id: 77
DB clean time (in percent of total time) : 24.08 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |93 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |93 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 00:19:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:18:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:19:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 00:18:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:19:10 2019
------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:19:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2688| 48| | |
| 1|DDLOG | 2688| 48| | |
| 2|BTCSCHED | 5378| 49| | |
| 3|RESTART_ALL | 1075| 46| | |
| 4|ENVCHECK | 16137| 20| | |
| 5|AUTOABAP | 1075| 46| | |
| 6|BGRFC_WATCHDOG | 1076| 46| | |
| 7|AUTOTH | 1459| 56| | |
| 8|AUTOCCMS | 5378| 49| | |
| 9|AUTOSECURITY | 5378| 49| | |
| 10|LOAD_CALCULATION | 322323| 1| | |
| 11|SPOOLALRM | 5379| 49| | |
| 12|CALL_DELAYED | 0| 6905| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 77 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:19:30:602 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:19:50:603 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:20:10:603 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-6345
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-6346
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 6347

Wed Sep 18 00:20:12:266 2019


*** ERROR => DpHdlDeadWp: W1 (pid 6345) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6345) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6345)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6346) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6346) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6346)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:20:18:264 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:20:30:603 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 6347 terminated

Wed Sep 18 00:20:50:604 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:21:10:604 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:21:30:605 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:21:50:606 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:22:10:606 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:22:30:607 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:22:50:608 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:23:10:609 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:23:30:609 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:23:50:610 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:24:10:615 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:24:30:615 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:24:50:615 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:20:10 2019, skip new snapshot

Wed Sep 18 00:25:10:616 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 78 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:25:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 4
Running requests[RQ_Q_PRIO_NORMAL] = 2
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:25:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10923369, readCount 10923369)
UPD : 0 (peak 31, writeCount 2318, readCount 2318)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080367, readCount 1080367)
SPO : 0 (peak 2, writeCount 11853, readCount 11853)
UP2 : 0 (peak 1, writeCount 1137, readCount 1137)
DISP: 0 (peak 67, writeCount 419928, readCount 419928)
GW : 0 (peak 45, writeCount 9968107, readCount 9968107)
ICM : 0 (peak 186, writeCount 198174, readCount 198174)
LWP : 2 (peak 15, writeCount 18013, readCount 18011)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:25:10 2019
------------------------------------------------------------
Current snapshot id: 78
DB clean time (in percent of total time) : 24.09 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |94 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 4|19804 |DIA |WP_RUN | | |norm|T49_U19509_M0 |HTTP_NORM| | | 0|<HANDLE PLUGIN> |000| | | |
| 5|8468 |DIA |WP_RUN | | |norm|T14_U19510_M0 |HTTP_NORM| | | 0|<HANDLE PLUGIN> |000| | | |
| 12| |BTC |WP_KILL| |94 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 4 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 00:25:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T14_U19510_M0 |000| |10.54.36.37 |00:25:10|5 |SAPMHTTP |norm| | | | 4590|
|HTTP_NORMAL |T49_U19509_M0 |000| |10.54.36.28 |00:25:10|4 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:24:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 5 logons with 5 sessions

Total ES (gross) memory of all sessions: 29 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:25:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 00:24:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:25:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:25:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2691| 48| | |
| 1|DDLOG | 2691| 48| | |
| 2|BTCSCHED | 5384| 49| | |
| 3|RESTART_ALL | 1077| 286| | |
| 4|ENVCHECK | 16155| 20| | |
| 5|AUTOABAP | 1077| 286| | |
| 6|BGRFC_WATCHDOG | 1078| 286| | |
| 7|AUTOTH | 1465| 56| | |
| 8|AUTOCCMS | 5384| 49| | |
| 9|AUTOSECURITY | 5384| 49| | |
| 10|LOAD_CALCULATION | 322681| 1| | |
| 11|SPOOLALRM | 5385| 49| | |
| 12|CALL_DELAYED | 0| 6545| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 78 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-16034
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:25:10:624 2019
DpWpDynCreate: created new work process W12-16035

Wed Sep 18 00:25:15:554 2019
*** ERROR => DpHdlDeadWp: W1 (pid 16034) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16034) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16034)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16035) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16035) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16035)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:25:30:616 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:25:50:617 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:26:10:617 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16324

Wed Sep 18 00:26:23:124 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:26:30:618 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16324 terminated

Wed Sep 18 00:26:50:618 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:27:10:619 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:27:30:620 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:27:50:620 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:28:10:621 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:28:30:621 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:28:50:622 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:29:10:623 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:29:30:624 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:29:50:624 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:30:10:625 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-18031
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18032

Wed Sep 18 00:30:13:322 2019
*** ERROR => DpHdlDeadWp: W1 (pid 18031) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18031) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18031)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18032) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18032) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18032)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:30:30:625 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:30:50:626 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:26:10 2019, skip new snapshot

Wed Sep 18 00:31:10:627 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 79 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:31:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:31:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10924213, readCount 10924213)
UPD : 0 (peak 31, writeCount 2319, readCount 2319)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080371, readCount 1080371)
SPO : 0 (peak 2, writeCount 11866, readCount 11866)
UP2 : 0 (peak 1, writeCount 1138, readCount 1138)
DISP: 0 (peak 67, writeCount 419969, readCount 419969)
GW : 0 (peak 45, writeCount 9968669, readCount 9968669)
ICM : 0 (peak 186, writeCount 198201, readCount 198201)
LWP : 0 (peak 15, writeCount 18028, readCount 18028)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:31:10 2019
------------------------------------------------------------
Current snapshot id: 79
DB clean time (in percent of total time) : 24.10 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |96 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |96 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 00:31:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:30:58|6 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:31:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 6|Wed Sep 18 00:30:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:31:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:31:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2694| 48| | |
| 1|DDLOG | 2694| 48| | |
| 2|BTCSCHED | 5390| 49| | |
| 3|RESTART_ALL | 1078| 226| | |
| 4|ENVCHECK | 16173| 20| | |
| 5|AUTOABAP | 1078| 226| | |
| 6|BGRFC_WATCHDOG | 1079| 226| | |
| 7|AUTOTH | 1471| 56| | |
| 8|AUTOCCMS | 5390| 49| | |
| 9|AUTOSECURITY | 5390| 49| | |
| 10|LOAD_CALCULATION | 323040| 1| | |
| 11|SPOOLALRM | 5391| 49| | |
| 12|CALL_DELAYED | 0| 6185| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 79 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:31:30:628 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:31:50:629 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:32:10:630 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18620

Wed Sep 18 00:32:20:862 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:32:30:631 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 18620 terminated

Wed Sep 18 00:32:50:631 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:33:10:631 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:33:30:632 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:33:50:633 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:34:10:633 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:34:30:634 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:34:50:634 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:35:10:635 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19724
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19725

Wed Sep 18 00:35:13:171 2019
*** ERROR => DpHdlDeadWp: W1 (pid 19724) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19724) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19724)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19725) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19725) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19725)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:35:30:636 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:35:50:636 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:36:10:637 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:36:30:638 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:36:50:638 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:32:10 2019, skip new snapshot

Wed Sep 18 00:37:10:638 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 80 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:37:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:37:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10925092, readCount 10925092)
UPD : 0 (peak 31, writeCount 2320, readCount 2320)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080375, readCount 1080375)
SPO : 0 (peak 2, writeCount 11879, readCount 11879)
UP2 : 0 (peak 1, writeCount 1139, readCount 1139)
DISP: 0 (peak 67, writeCount 420010, readCount 420010)
GW : 0 (peak 45, writeCount 9969259, readCount 9969259)
ICM : 0 (peak 186, writeCount 198228, readCount 198228)
LWP : 0 (peak 15, writeCount 18043, readCount 18043)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:37:10 2019
------------------------------------------------------------
Current snapshot id: 80
DB clean time (in percent of total time) : 24.11 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |97 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |97 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 00:37:10 2019


------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:36:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:37:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Wed Sep 18 00:36:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:37:10 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:37:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2697| 48| | |
| 1|DDLOG | 2697| 48| | |
| 2|BTCSCHED | 5396| 49| | |
| 3|RESTART_ALL | 1079| 166| | |
| 4|ENVCHECK | 16191| 20| | |
| 5|AUTOABAP | 1079| 166| | |
| 6|BGRFC_WATCHDOG | 1080| 166| | |
| 7|AUTOTH | 1477| 56| | |
| 8|AUTOCCMS | 5396| 49| | |
| 9|AUTOSECURITY | 5396| 49| | |
| 10|LOAD_CALCULATION | 323399| 1| | |
| 11|SPOOLALRM | 5397| 49| | |
| 12|CALL_DELAYED | 0| 5825| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 80 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:37:30:639 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:37:50:640 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:38:10:640 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 20744

Wed Sep 18 00:38:18:775 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:38:30:641 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 20744 terminated

Wed Sep 18 00:38:50:641 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:39:10:641 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:39:30:642 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:39:50:642 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:40:10:642 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21659
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21660

Wed Sep 18 00:40:13:180 2019

*** ERROR => DpHdlDeadWp: W1 (pid 21659) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21659) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21659)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21660) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21660) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21660)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:40:30:642 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:40:50:643 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:41:10:643 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:41:30:643 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:41:50:644 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:42:10:645 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:42:30:645 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:42:50:646 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:38:10 2019, skip new snapshot

Wed Sep 18 00:43:10:646 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 81 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:43:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:43:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000


DIA : 0 (peak 291, writeCount 10925906, readCount 10925906)
UPD : 0 (peak 31, writeCount 2322, readCount 2322)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080383, readCount 1080383)
SPO : 0 (peak 2, writeCount 11893, readCount 11893)
UP2 : 0 (peak 1, writeCount 1141, readCount 1141)
DISP: 0 (peak 67, writeCount 420051, readCount 420051)
GW : 0 (peak 45, writeCount 9969805, readCount 9969805)
ICM : 0 (peak 186, writeCount 198257, readCount 198257)
LWP : 2 (peak 15, writeCount 18073, readCount 18071)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:43:10 2019


------------------------------------------------------------

Current snapshot id: 81


DB clean time (in percent of total time) : 24.12 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |98 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_RUN | | |norm|T5_U20680_M0 |HTTP_NORM| | |1|<HANDLE PLUGIN> |000| | | |
| 12| |BTC |WP_KILL| |98 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 00:43:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T5_U20680_M0 |000| |10.54.36.37 |00:43:09|2 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:42:58|6 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:43:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 6|Wed Sep 18 00:42:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:43:10 2019


------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:43:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2700| 48| | |
| 1|DDLOG | 2700| 48| | |
| 2|BTCSCHED | 5402| 49| | |
| 3|RESTART_ALL | 1080| 106| | |
| 4|ENVCHECK | 16209| 20| | |
| 5|AUTOABAP | 1080| 106| | |
| 6|BGRFC_WATCHDOG | 1081| 106| | |
| 7|AUTOTH | 1483| 56| | |
| 8|AUTOCCMS | 5402| 49| | |
| 9|AUTOSECURITY | 5402| 49| | |
| 10|LOAD_CALCULATION | 323758| 1| | |
| 11|SPOOLALRM | 5403| 49| | |
| 12|CALL_DELAYED | 0| 5465| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 81 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:43:30:646 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:43:50:647 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:44:10:648 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23701

Wed Sep 18 00:44:17:905 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:44:30:648 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23701 terminated

Wed Sep 18 00:44:50:649 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:45:10:649 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-27078
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-27079

Wed Sep 18 00:45:12:368 2019

*** ERROR => DpHdlDeadWp: W1 (pid 27078) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27078) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27078)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27079) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27079) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27079)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:45:30:650 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:45:50:651 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:46:10:651 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:46:30:652 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:46:50:653 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:47:10:653 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:47:30:654 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:47:50:654 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:48:10:655 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:48:30:655 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:48:50:656 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:44:10 2019, skip new snapshot

Wed Sep 18 00:49:10:657 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 82 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:49:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:49:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000


DIA : 0 (peak 291, writeCount 10926728, readCount 10926728)
UPD : 0 (peak 31, writeCount 2323, readCount 2323)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080387, readCount 1080387)
SPO : 0 (peak 2, writeCount 11906, readCount 11906)
UP2 : 0 (peak 1, writeCount 1142, readCount 1142)
DISP: 0 (peak 67, writeCount 420091, readCount 420091)
GW : 0 (peak 45, writeCount 9970357, readCount 9970357)
ICM : 0 (peak 186, writeCount 198284, readCount 198284)
LWP : 2 (peak 15, writeCount 18088, readCount 18086)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:49:10 2019


------------------------------------------------------------

Current snapshot id: 82


DB clean time (in percent of total time) : 24.13 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |99 |norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |99 |low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 00:49:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:48:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:49:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 00:48:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:49:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:49:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2703| 48| | |
| 1|DDLOG | 2703| 48| | |
| 2|BTCSCHED | 5408| 49| | |
| 3|RESTART_ALL | 1081| 46| | |
| 4|ENVCHECK | 16227| 20| | |
| 5|AUTOABAP | 1081| 46| | |
| 6|BGRFC_WATCHDOG | 1082| 46| | |
| 7|AUTOTH | 1489| 56| | |
| 8|AUTOCCMS | 5408| 49| | |
| 9|AUTOSECURITY | 5408| 49| | |
| 10|LOAD_CALCULATION | 324117| 1| | |
| 11|SPOOLALRM | 5409| 49| | |
| 12|CALL_DELAYED | 0| 5105| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 82 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:49:30:657 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:49:50:658 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:50:10:659 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-5239
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-5240
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 5241

Wed Sep 18 00:50:12:336 2019

*** ERROR => DpHdlDeadWp: W1 (pid 5239) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5239) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5239)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5240) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5240) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5240)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:50:17:436 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:50:30:659 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 5241 terminated

Wed Sep 18 00:50:50:660 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:51:10:660 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:51:30:661 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:51:50:661 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:52:10:662 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:52:30:663 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:52:50:664 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new snapshot

Wed Sep 18 00:53:10:664 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot

Wed Sep 18 00:53:30:665 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot

Wed Sep 18 00:53:50:666 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot

Wed Sep 18 00:54:10:666 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot

Wed Sep 18 00:54:30:667 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot

Wed Sep 18 00:54:50:668 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:50:10 2019, skip new
snapshot

Wed Sep 18 00:55:10:669 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 83 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:55:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 00:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 00:55:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10927506, readCount 10927506)
UPD : 0 (peak 31, writeCount 2324, readCount 2324)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080391, readCount 1080391)
SPO : 0 (peak 2, writeCount 11919, readCount 11919)
UP2 : 0 (peak 1, writeCount 1143, readCount 1143)
DISP: 0 (peak 67, writeCount 420136, readCount 420136)
GW  : 0 (peak 45, writeCount 9970912, readCount 9970912)
ICM : 1 (peak 186, writeCount 198311, readCount 198310)
LWP : 2 (peak 15, writeCount 18103, readCount 18101)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 00:55:10 2019
------------------------------------------------------------
Current snapshot id: 83
DB clean time (in percent of total time) : 24.13 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |100|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |100|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 00:55:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |00:54:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 00:55:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 00:54:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 00:55:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 00:55:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2706| 48| | |
| 1|DDLOG | 2706| 48| | |
| 2|BTCSCHED | 5414| 49| | |
| 3|RESTART_ALL | 1083| 286| | |
| 4|ENVCHECK | 16245| 20| | |
| 5|AUTOABAP | 1083| 286| | |
| 6|BGRFC_WATCHDOG | 1084| 286| | |
| 7|AUTOTH | 1495| 56| | |
| 8|AUTOCCMS | 5414| 49| | |
| 9|AUTOSECURITY | 5414| 49| | |
| 10|LOAD_CALCULATION | 324475| 0| | |
| 11|SPOOLALRM | 5415| 49| | |
| 12|CALL_DELAYED | 0| 4745| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 83 (Reason: Workprocess 1 died / Time: Wed Sep 18 00:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-6996
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:55:10:675 2019
DpWpDynCreate: created new work process W12-6997

Wed Sep 18 00:55:12:392 2019
*** ERROR => DpHdlDeadWp: W1 (pid 6996) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6996) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6996)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6997) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6997) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6997)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:55:30:669 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:55:50:670 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 00:56:10:670 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7256

Wed Sep 18 00:56:16:835 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:56:30:671 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7256 terminated

Wed Sep 18 00:56:50:671 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:57:10:672 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:57:30:672 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:57:50:673 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:58:10:673 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:58:30:674 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:58:50:675 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:59:10:675 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:59:30:675 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 00:59:50:676 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 01:00:10:677 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-8961
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-8962

Wed Sep 18 01:00:12:385 2019
*** ERROR => DpHdlDeadWp: W1 (pid 8961) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8961) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8961)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8962) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8962) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8962)
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 01:00:30:678 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 01:00:50:678 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 00:56:10 2019, skip new snapshot

Wed Sep 18 01:01:10:678 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 84 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:01:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:01:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10928500, readCount 10928500)
UPD : 0 (peak 31, writeCount 2325, readCount 2325)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080395, readCount 1080395)
SPO : 0 (peak 2, writeCount 11932, readCount 11932)
UP2 : 0 (peak 1, writeCount 1144, readCount 1144)
DISP: 0 (peak 67, writeCount 420177, readCount 420177)
GW  : 1 (peak 45, writeCount 9971622, readCount 9971621)
ICM : 0 (peak 186, writeCount 198338, readCount 198338)
LWP : 0 (peak 15, writeCount 18118, readCount 18118)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:01:10 2019
------------------------------------------------------------
Current snapshot id: 84
DB clean time (in percent of total time) : 24.14 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |102|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |102|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 01:01:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:00:58|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:01:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 01:00:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:01:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:01:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2709| 48| | |
| 1|DDLOG | 2709| 48| | |
| 2|BTCSCHED | 5420| 49| | |
| 3|RESTART_ALL | 1084| 226| | |
| 4|ENVCHECK | 16263| 20| | |
| 5|AUTOABAP | 1084| 226| | |
| 6|BGRFC_WATCHDOG | 1085| 226| | |
| 7|AUTOTH | 1501| 56| | |
| 8|AUTOCCMS | 5420| 49| | |
| 9|AUTOSECURITY | 5420| 49| | |
| 10|LOAD_CALCULATION | 324834| 1| | |
| 11|SPOOLALRM | 5421| 49| | |
| 12|CALL_DELAYED | 0| 4385| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 84 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:01:30:679 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:01:50:679 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:02:10:680 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 12522

Wed Sep 18 01:02:17:222 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:02:30:680 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 12522 terminated

Wed Sep 18 01:02:50:681 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:03:10:681 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:03:30:682 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:03:50:682 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:04:10:682 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:04:30:683 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:04:50:683 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:05:10:683 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-24512
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-24513

Wed Sep 18 01:05:12:096 2019
*** ERROR => DpHdlDeadWp: W1 (pid 24512) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24512) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 24512)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 24513) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24513) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 24513)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:05:30:684 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:05:50:684 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:06:10:685 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:06:30:686 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:06:50:686 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:02:10 2019, skip new snapshot

Wed Sep 18 01:07:10:687 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 85 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:07:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:07:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10929610, readCount 10929610)
UPD : 0 (peak 31, writeCount 2326, readCount 2326)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080399, readCount 1080399)
SPO : 0 (peak 2, writeCount 11945, readCount 11945)
UP2 : 0 (peak 1, writeCount 1145, readCount 1145)
DISP: 0 (peak 67, writeCount 420218, readCount 420218)
GW : 0 (peak 45, writeCount 9972448, readCount 9972448)
ICM : 0 (peak 186, writeCount 198365, readCount 198365)
LWP : 0 (peak 15, writeCount 18133, readCount 18133)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:07:10 2019
------------------------------------------------------------

Current snapshot id: 85
DB clean time (in percent of total time) : 24.15 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |103|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |103|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:07:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:06:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:07:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 01:06:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:07:10 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:07:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2712| 48| | |
| 1|DDLOG | 2712| 48| | |
| 2|BTCSCHED | 5426| 49| | |
| 3|RESTART_ALL | 1085| 166| | |
| 4|ENVCHECK | 16281| 20| | |
| 5|AUTOABAP | 1085| 166| | |
| 6|BGRFC_WATCHDOG | 1086| 166| | |
| 7|AUTOTH | 1507| 56| | |
| 8|AUTOCCMS | 5426| 49| | |
| 9|AUTOSECURITY | 5426| 49| | |
| 10|LOAD_CALCULATION | 325192| 1| | |
| 11|SPOOLALRM | 5427| 49| | |
| 12|CALL_DELAYED | 0| 4025| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 85 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:07:30:687 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:07:50:688 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:08:10:689 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 4545

Wed Sep 18 01:08:17:336 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:08:30:689 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 4545 terminated

Wed Sep 18 01:08:50:690 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:09:10:690 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:09:30:691 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:09:50:692 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:10:10:693 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-7942
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-7943

Wed Sep 18 01:10:12:503 2019

*** ERROR => DpHdlDeadWp: W1 (pid 7942) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7942) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7942)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7943) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7943) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7943)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:10:30:693 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:10:50:694 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:11:10:695 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:11:30:695 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:11:50:696 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:12:10:697 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:12:30:697 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:12:50:698 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:08:10 2019, skip new snapshot

Wed Sep 18 01:13:10:698 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 86 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:13:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:13:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10930471, readCount 10930471)
UPD : 0 (peak 31, writeCount 2328, readCount 2328)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080407, readCount 1080407)
SPO : 0 (peak 2, writeCount 11959, readCount 11959)
UP2 : 0 (peak 1, writeCount 1147, readCount 1147)
DISP: 0 (peak 67, writeCount 420259, readCount 420259)
GW : 0 (peak 45, writeCount 9973048, readCount 9973048)
ICM : 0 (peak 186, writeCount 198394, readCount 198394)
LWP : 2 (peak 15, writeCount 18163, readCount 18161)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:13:10 2019
------------------------------------------------------------

Current snapshot id: 86
DB clean time (in percent of total time) : 24.16 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |104|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |104|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:13:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:12:58|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:13:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Wed Sep 18 01:12:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:13:10 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:13:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2715| 48| | |
| 1|DDLOG | 2715| 48| | |
| 2|BTCSCHED | 5432| 49| | |
| 3|RESTART_ALL | 1086| 106| | |
| 4|ENVCHECK | 16299| 20| | |
| 5|AUTOABAP | 1086| 106| | |
| 6|BGRFC_WATCHDOG | 1087| 106| | |
| 7|AUTOTH | 1513| 56| | |
| 8|AUTOCCMS | 5432| 49| | |
| 9|AUTOSECURITY | 5432| 49| | |
| 10|LOAD_CALCULATION | 325552| 1| | |
| 11|SPOOLALRM | 5433| 49| | |
| 12|CALL_DELAYED | 0| 3665| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 86 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:13:30:699 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:13:50:700 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:14:10:700 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9028

Wed Sep 18 01:14:17:412 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:14:30:701 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9028 terminated

Wed Sep 18 01:14:50:701 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:15:10:701 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-9670
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-9671

Wed Sep 18 01:15:12:188 2019

*** ERROR => DpHdlDeadWp: W1 (pid 9670) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9670) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 9670)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 9671) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9671) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 9671)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:15:30:702 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:15:50:702 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:16:10:703 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:16:30:703 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:16:50:704 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:17:10:705 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:17:30:705 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:17:50:706 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:18:10:706 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:18:30:706 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:18:50:706 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:14:10 2019, skip new snapshot

Wed Sep 18 01:19:10:707 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 87 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:19:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:19:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10931338, readCount 10931338)
UPD : 0 (peak 31, writeCount 2329, readCount 2329)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080411, readCount 1080411)
SPO : 0 (peak 2, writeCount 11972, readCount 11972)
UP2 : 0 (peak 1, writeCount 1148, readCount 1148)
DISP: 0 (peak 67, writeCount 420300, readCount 420300)
GW : 1 (peak 45, writeCount 9973648, readCount 9973647)
ICM : 1 (peak 186, writeCount 198421, readCount 198420)
LWP : 2 (peak 15, writeCount 18178, readCount 18176)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:19:10 2019
------------------------------------------------------------

Current snapshot id: 87
DB clean time (in percent of total time) : 24.16 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |105|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |105|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:19:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:18:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:19:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 01:18:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:19:10 2019
------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:19:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2718| 48| | |
| 1|DDLOG | 2718| 48| | |
| 2|BTCSCHED | 5438| 49| | |
| 3|RESTART_ALL | 1087| 46| | |
| 4|ENVCHECK | 16317| 20| | |
| 5|AUTOABAP | 1087| 46| | |
| 6|BGRFC_WATCHDOG | 1088| 46| | |
| 7|AUTOTH | 1519| 56| | |
| 8|AUTOCCMS | 5438| 49| | |
| 9|AUTOSECURITY | 5438| 49| | |
| 10|LOAD_CALCULATION | 325910| 1| | |
| 11|SPOOLALRM | 5439| 49| | |
| 12|CALL_DELAYED | 0| 3305| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 87 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:19:30:707 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:19:50:708 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:20:10:709 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-11363
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-11364
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 11365

Wed Sep 18 01:20:12:160 2019


*** ERROR => DpHdlDeadWp: W1 (pid 11363) died (severity=0, status=65280) [dpxxwp.c
1463]
DpTraceWpStatus: child (pid=11363) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 11363)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new
snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 11364) died (severity=0, status=65280) [dpxxwp.c
1463]
DpTraceWpStatus: child (pid=11364) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 11364)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new
snapshot

Wed Sep 18 01:20:17:188 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:20:30:709 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 11365 terminated

Wed Sep 18 01:20:50:709 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:21:10:710 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:21:30:711 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:21:50:711 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:22:10:711 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:22:30:712 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:22:50:713 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:23:10:713 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:23:30:713 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:23:50:714 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:24:10:714 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:24:30:715 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:24:50:716 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:20:10 2019, skip new snapshot

Wed Sep 18 01:25:10:717 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 88 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:25:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:25:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10932162, readCount 10932162)
UPD : 0 (peak 31, writeCount 2330, readCount 2330)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080415, readCount 1080415)
SPO : 0 (peak 2, writeCount 11985, readCount 11985)
UP2 : 0 (peak 1, writeCount 1149, readCount 1149)
DISP: 0 (peak 67, writeCount 420345, readCount 420345)
GW  : 0 (peak 45, writeCount 9974215, readCount 9974215)
ICM : 0 (peak 186, writeCount 198448, readCount 198448)
LWP : 2 (peak 15, writeCount 18193, readCount 18191)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:25:10 2019
------------------------------------------------------------
Current snapshot id: 88
DB clean time (in percent of total time) : 24.17 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |106|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
|  4|19804   |DIA |WP_RUN |     |   |norm|T5_U23617_M0    |HTTP_NORM|      |   |    1|<HANDLE PLUGIN>                         |000|            |                    |                    |
| 12|        |BTC |WP_KILL|     |106|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:25:10 2019
------------------------------------------------------------
|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T5_U23617_M0    |000|            |10.54.36.37         |01:25:09|4  |SAPMHTTP                                |norm|     |                                                  |          |      4590|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |01:24:58|4  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:25:10 2019
------------------------------------------------------------
|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  4|Wed Sep 18 01:24:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:25:10 2019
------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:25:10 2019
------------------------------------------------------------
|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2721|        48|                    |          |
|       1|DDLOG               |      2721|        48|                    |          |
|       2|BTCSCHED            |      5444|        49|                    |          |
|       3|RESTART_ALL         |      1089|       286|                    |          |
|       4|ENVCHECK            |     16335|        20|                    |          |
|       5|AUTOABAP            |      1089|       286|                    |          |
|       6|BGRFC_WATCHDOG      |      1090|       286|                    |          |
|       7|AUTOTH              |      1525|        56|                    |          |
|       8|AUTOCCMS            |      5444|        49|                    |          |
|       9|AUTOSECURITY        |      5444|        49|                    |          |
|      10|LOAD_CALCULATION    |    326269|         1|                    |          |
|      11|SPOOLALRM           |      5445|        49|                    |          |
|      12|CALL_DELAYED        |         0|      2945|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 88 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

DpWpDynCreate: created new work process W1-13049
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:25:10:723 2019

DpWpDynCreate: created new work process W12-13050

Wed Sep 18 01:25:12:141 2019

*** ERROR => DpHdlDeadWp: W1 (pid 13049) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13049) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13049)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13050) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13050) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13050)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
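The trace shows a tight restart loop: DpWpDynCreate spawns W1/W12 and DpHdlDeadWp reports them dead roughly two seconds later. A sketch (illustrative names, assuming the trace's timestamp-header format) that pairs each create with the following death to measure work-process lifetimes:

```python
import re
from datetime import datetime

# Timestamp headers look like: "Wed Sep 18 01:20:10:709 2019"
TS = re.compile(r"^(\w{3}) (\w{3}) (\d+) (\d{2}:\d{2}:\d{2}):\d{3} (\d{4})$")
CREATE = re.compile(r"DpWpDynCreate: created new work process (W\d+)-(\d+)")
DIED = re.compile(r"DpHdlDeadWp: (W\d+) \(pid (\d+)\) died")

def restart_loop_lifetimes(lines):
    """Return {pid: seconds_alive} for dynamically created work processes
    that were later reported dead, using the trace's timestamp headers."""
    now, born, lifetimes = None, {}, {}
    for line in lines:
        m = TS.match(line.strip())
        if m:
            now = datetime.strptime(" ".join(m.groups()), "%a %b %d %H:%M:%S %Y")
            continue
        m = CREATE.search(line)
        if m and now:
            born[int(m.group(2))] = now
            continue
        m = DIED.search(line)
        if m and now and int(m.group(2)) in born:
            pid = int(m.group(2))
            lifetimes[pid] = (now - born[pid]).total_seconds()
    return lifetimes
```

Applied to the W1-11363 sequence above (created 01:20:10, died 01:20:12), this reports a lifetime of about 2 seconds, which is the cadence repeated for every W1/W12 pair in this trace.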

Wed Sep 18 01:25:30:718 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:25:50:718 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:26:10:718 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13366

Wed Sep 18 01:26:17:108 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:26:30:718 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13366 terminated

Wed Sep 18 01:26:50:718 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:27:10:719 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:27:30:720 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:27:50:721 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:28:10:721 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:28:30:721 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:28:50:723 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:29:10:723 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:29:30:724 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:29:50:725 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:30:10:725 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-14867
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-14868

Wed Sep 18 01:30:12:433 2019

*** ERROR => DpHdlDeadWp: W1 (pid 14867) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14867) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14867)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14868) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14868) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14868)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:30:30:726 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:30:50:727 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:26:10 2019, skip new snapshot

Wed Sep 18 01:31:10:728 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 89 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:31:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:31:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10932970, readCount 10932970)
UPD : 0 (peak 31, writeCount 2331, readCount 2331)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080419, readCount 1080419)
SPO : 0 (peak 2, writeCount 11998, readCount 11998)
UP2 : 0 (peak 1, writeCount 1150, readCount 1150)
DISP: 0 (peak 67, writeCount 420385, readCount 420385)
GW  : 0 (peak 45, writeCount 9974761, readCount 9974761)
ICM : 0 (peak 186, writeCount 198477, readCount 198477)
LWP : 0 (peak 15, writeCount 18208, readCount 18208)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:31:10 2019
------------------------------------------------------------
Current snapshot id: 89
DB clean time (in percent of total time) : 24.18 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |108|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |108|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:31:10 2019
------------------------------------------------------------
|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |01:30:58|3  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:31:10 2019
------------------------------------------------------------
|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  3|Wed Sep 18 01:30:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:31:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:31:10 2019
------------------------------------------------------------
|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2724|        48|                    |          |
|       1|DDLOG               |      2724|        48|                    |          |
|       2|BTCSCHED            |      5450|        49|                    |          |
|       3|RESTART_ALL         |      1090|       226|                    |          |
|       4|ENVCHECK            |     16353|        20|                    |          |
|       5|AUTOABAP            |      1090|       226|                    |          |
|       6|BGRFC_WATCHDOG      |      1091|       226|                    |          |
|       7|AUTOTH              |      1531|        56|                    |          |
|       8|AUTOCCMS            |      5450|        49|                    |          |
|       9|AUTOSECURITY        |      5450|        49|                    |          |
|      10|LOAD_CALCULATION    |    326627|         1|                    |          |
|      11|SPOOLALRM           |      5451|        49|                    |          |
|      12|CALL_DELAYED        |         0|      2585|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 89 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:31:30:728 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:31:50:729 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:32:10:729 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15389

Wed Sep 18 01:32:17:319 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:32:30:730 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15389 terminated

Wed Sep 18 01:32:50:730 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:33:10:730 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:33:30:731 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:33:50:732 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:34:10:732 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:34:30:733 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:34:50:733 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:35:10:734 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-16529
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-16530

Wed Sep 18 01:35:12:483 2019

*** ERROR => DpHdlDeadWp: W1 (pid 16529) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16529) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16529)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16530) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16530) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16530)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:35:30:734 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:35:50:735 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:36:10:736 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:36:30:737 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:36:50:737 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:32:10 2019, skip new snapshot

Wed Sep 18 01:37:10:737 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 90 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:37:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:37:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10933820, readCount 10933820)
UPD : 0 (peak 31, writeCount 2332, readCount 2332)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080423, readCount 1080423)
SPO : 0 (peak 2, writeCount 12011, readCount 12011)
UP2 : 0 (peak 1, writeCount 1151, readCount 1151)
DISP: 0 (peak 67, writeCount 420426, readCount 420426)
GW : 0 (peak 45, writeCount 9975341, readCount 9975341)
ICM : 0 (peak 186, writeCount 198504, readCount 198504)
LWP : 0 (peak 15, writeCount 18223, readCount 18223)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:37:10 2019
------------------------------------------------------------
Current snapshot id: 90
DB clean time (in percent of total time) : 24.19 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |109|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |109|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:37:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:36:58|2 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:37:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Wed Sep 18 01:36:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:37:10 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:37:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2727| 48| | |
| 1|DDLOG | 2727| 48| | |
| 2|BTCSCHED | 5456| 49| | |
| 3|RESTART_ALL | 1091| 166| | |
| 4|ENVCHECK | 16371| 20| | |
| 5|AUTOABAP | 1091| 166| | |
| 6|BGRFC_WATCHDOG | 1092| 166| | |
| 7|AUTOTH | 1537| 56| | |
| 8|AUTOCCMS | 5456| 49| | |
| 9|AUTOSECURITY | 5456| 49| | |
| 10|LOAD_CALCULATION | 326986| 1| | |
| 11|SPOOLALRM | 5457| 49| | |
| 12|CALL_DELAYED | 0| 2225| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 90 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:37:10 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:37:30:738 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:37:50:738 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:38:10:739 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 17433

Wed Sep 18 01:38:17:429 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:38:30:740 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 17433 terminated

Wed Sep 18 01:38:50:740 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:39:10:741 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:39:30:741 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:39:50:742 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:40:10:742 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-18298
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18299

Wed Sep 18 01:40:12:239 2019
*** ERROR => DpHdlDeadWp: W1 (pid 18298) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18298) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18298)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18299) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18299) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18299)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:40:30:743 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:40:50:743 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:41:10:743 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:41:30:744 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:41:50:744 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:42:10:745 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:42:30:745 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:42:50:746 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:38:10 2019, skip new snapshot

Wed Sep 18 01:43:10:747 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 91 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:43:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:43:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10934637, readCount 10934637)
UPD : 0 (peak 31, writeCount 2334, readCount 2334)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080431, readCount 1080431)
SPO : 0 (peak 2, writeCount 12025, readCount 12025)
UP2 : 0 (peak 1, writeCount 1153, readCount 1153)
DISP: 0 (peak 67, writeCount 420467, readCount 420467)
GW : 0 (peak 45, writeCount 9975893, readCount 9975893)
ICM : 0 (peak 186, writeCount 198533, readCount 198533)
LWP : 2 (peak 15, writeCount 18253, readCount 18251)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:43:10 2019
------------------------------------------------------------
Current snapshot id: 91
DB clean time (in percent of total time) : 24.20 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |110|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |110|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 01:43:10 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:42:58|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:43:10 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Wed Sep 18 01:42:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:43:10 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:43:10 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2730| 48| | |
| 1|DDLOG | 2730| 48| | |
| 2|BTCSCHED | 5462| 49| | |
| 3|RESTART_ALL | 1092| 106| | |
| 4|ENVCHECK | 16389| 20| | |
| 5|AUTOABAP | 1092| 106| | |
| 6|BGRFC_WATCHDOG | 1093| 106| | |
| 7|AUTOTH | 1543| 56| | |
| 8|AUTOCCMS | 5462| 49| | |
| 9|AUTOSECURITY | 5462| 49| | |
| 10|LOAD_CALCULATION | 327344| 1| | |
| 11|SPOOLALRM | 5463| 49| | |
| 12|CALL_DELAYED | 0| 1865| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 91 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:43:10 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:43:30:747 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:43:50:748 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:44:10:748 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19352

Wed Sep 18 01:44:17:352 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:44:30:748 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19352 terminated

Wed Sep 18 01:44:50:748 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:45:10:749 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-20082
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-20083

Wed Sep 18 01:45:12:455 2019
*** ERROR => DpHdlDeadWp: W1 (pid 20082) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20082) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 20082)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 20083) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20083) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 20083)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:45:30:750 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:45:50:750 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:46:10:751 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:46:30:752 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:46:50:752 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:47:10:753 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:47:30:753 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:47:50:754 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:48:10:754 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:48:30:754 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:48:50:755 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:44:10 2019, skip new snapshot

Wed Sep 18 01:49:10:756 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 92 (Reason: Workprocess 1 died / Time: Wed Sep 18


01:49:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:49:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10935488, readCount 10935488)


UPD : 0 (peak 31, writeCount 2335, readCount 2335)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080435, readCount 1080435)
SPO : 0 (peak 2, writeCount 12038, readCount 12038)
UP2 : 0 (peak 1, writeCount 1154, readCount 1154)
DISP: 0 (peak 67, writeCount 420508, readCount 420508)
GW : 0 (peak 45, writeCount 9976493, readCount 9976493)
ICM : 1 (peak 186, writeCount 198560, readCount 198559)
LWP : 2 (peak 15, writeCount 18268, readCount 18266)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:49:10 2019


------------------------------------------------------------

Current snapshot id: 92


DB clean time (in percent of total time) : 24.20 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|


Time |Program |Cli|User |Action
|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|---
--|----------------------------------------|---|------------|--------------------|-
-------------------|
| 1| |DIA |WP_KILL| |111|norm|T138_U21626_M0 |HTTP_NORM| | |
|CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| |
|
| 12| |BTC |WP_KILL| |111|low |T105_U21576_M0 |BATCH | | |
|SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD |
|

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 01:49:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |


Program |Prio|Tasks|Application-Info
|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|-
---------------------------------------|----|-----|--------------------------------
------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |01:48:58|6 |
SAPMSSY1 |norm| |
| | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |
SMREP_PROCESS_BW_DATA_QUEUE |low | |
| | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |
SAPMHTTP |norm| |
| | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:49:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |


Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|-
-----|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |
SERVER|RECEIVE | 6|Wed Sep 18 01:48:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:49:10 2019


------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:49:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2733|        48|                    |          |
|       1|DDLOG               |      2733|        48|                    |          |
|       2|BTCSCHED            |      5468|        50|                    |          |
|       3|RESTART_ALL         |      1093|        46|                    |          |
|       4|ENVCHECK            |     16407|        20|                    |          |
|       5|AUTOABAP            |      1093|        46|                    |          |
|       6|BGRFC_WATCHDOG      |      1094|        46|                    |          |
|       7|AUTOTH              |      1549|        56|                    |          |
|       8|AUTOCCMS            |      5468|        50|                    |          |
|       9|AUTOSECURITY        |      5468|        50|                    |          |
|      10|LOAD_CALCULATION    |    327703|         1|                    |          |
|      11|SPOOLALRM           |      5469|        50|                    |          |
|      12|CALL_DELAYED        |         0|      1505|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 92 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:49:30:756 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:49:50:756 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:50:10:757 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-21577
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-21578
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 21579

Wed Sep 18 01:50:12:448 2019


*** ERROR => DpHdlDeadWp: W1 (pid 21577) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21577) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21577)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21578) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21578) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21578)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:50:17:586 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:50:30:758 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 21579 terminated

Wed Sep 18 01:51:10:758 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:51:30:759 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:51:50:759 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:52:10:760 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:52:30:761 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:52:50:761 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:53:10:761 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:53:30:762 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:53:50:762 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:54:10:763 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:54:30:763 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:54:50:764 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:50:10 2019, skip new snapshot

Wed Sep 18 01:55:10:765 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 93 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:55:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 01:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 01:55:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10936317, readCount 10936317)
UPD : 0 (peak 31, writeCount 2336, readCount 2336)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080439, readCount 1080439)
SPO : 0 (peak 2, writeCount 12051, readCount 12051)
UP2 : 0 (peak 1, writeCount 1155, readCount 1155)
DISP: 0 (peak 67, writeCount 420553, readCount 420553)
GW : 0 (peak 45, writeCount 9977052, readCount 9977052)
ICM : 1 (peak 186, writeCount 198587, readCount 198586)
LWP : 2 (peak 15, writeCount 18283, readCount 18281)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 01:55:10 2019


------------------------------------------------------------

Current snapshot id: 93


DB clean time (in percent of total time) : 24.21 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |112|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |112|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 01:55:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |01:54:58|3  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 01:55:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  3|Wed Sep 18 01:54:58 2019        |
Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 01:55:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 01:55:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2736|        48|                    |          |
|       1|DDLOG               |      2736|        48|                    |          |
|       2|BTCSCHED            |      5474|        50|                    |          |
|       3|RESTART_ALL         |      1095|       286|                    |          |
|       4|ENVCHECK            |     16425|        20|                    |          |
|       5|AUTOABAP            |      1095|       286|                    |          |
|       6|BGRFC_WATCHDOG      |      1096|       286|                    |          |
|       7|AUTOTH              |      1555|        56|                    |          |
|       8|AUTOCCMS            |      5474|        50|                    |          |
|       9|AUTOSECURITY        |      5474|        50|                    |          |
|      10|LOAD_CALCULATION    |    328061|         0|                    |          |
|      11|SPOOLALRM           |      5475|        50|                    |          |
|      12|CALL_DELAYED        |         0|      1145|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 93 (Reason: Workprocess 1 died / Time: Wed Sep 18 01:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-23340
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Wed Sep 18 01:55:10:771 2019
DpWpDynCreate: created new work process W12-23342

Wed Sep 18 01:55:12:514 2019


*** ERROR => DpHdlDeadWp: W1 (pid 23340) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23340) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 23340)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 23342) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23342) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 23342)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:55:30:765 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:55:50:766 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 01:56:10:766 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23753

Wed Sep 18 01:56:17:805 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:56:30:767 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23753 terminated

Wed Sep 18 01:56:50:768 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:57:10:768 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:57:30:768 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:57:50:768 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:58:10:768 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:58:30:769 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:58:50:770 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:59:10:770 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:59:30:770 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 01:59:50:770 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 02:00:10:771 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-25516
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-25517

Wed Sep 18 02:00:12:694 2019


*** ERROR => DpHdlDeadWp: W1 (pid 25516) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25516) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25516)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25517) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25517) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25517)
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 02:00:30:772 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 02:00:50:773 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 01:56:10 2019, skip new snapshot

Wed Sep 18 02:01:10:773 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 94 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:01:10 2019) - begin **********
Server smprd02_SMP_00, Wed Sep 18 02:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:01:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10937139, readCount 10937139)
UPD : 0 (peak 31, writeCount 2337, readCount 2337)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080443, readCount 1080443)
SPO : 0 (peak 2, writeCount 12064, readCount 12064)
UP2 : 0 (peak 1, writeCount 1156, readCount 1156)
DISP: 0 (peak 67, writeCount 420594, readCount 420594)
GW : 0 (peak 45, writeCount 9977622, readCount 9977622)
ICM : 0 (peak 186, writeCount 198614, readCount 198614)
LWP : 0 (peak 15, writeCount 18298, readCount 18298)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:01:10 2019


------------------------------------------------------------

Current snapshot id: 94


DB clean time (in percent of total time) : 24.22 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |114|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |114|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 02:01:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |02:00:58|6  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:01:10 2019


------------------------------------------------------------
|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  6|Wed Sep 18 02:00:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:01:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:01:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2739|        48|                    |          |
|       1|DDLOG               |      2739|        48|                    |          |
|       2|BTCSCHED            |      5480|        50|                    |          |
|       3|RESTART_ALL         |      1096|       226|                    |          |
|       4|ENVCHECK            |     16443|        20|                    |          |
|       5|AUTOABAP            |      1096|       226|                    |          |
|       6|BGRFC_WATCHDOG      |      1097|       226|                    |          |
|       7|AUTOTH              |      1561|        56|                    |          |
|       8|AUTOCCMS            |      5480|        50|                    |          |
|       9|AUTOSECURITY        |      5480|        50|                    |          |
|      10|LOAD_CALCULATION    |    328420|         1|                    |          |
|      11|SPOOLALRM           |      5481|        50|                    |          |
|      12|CALL_DELAYED        |         0|       785|                    |          |
|

Found 13 periodic tasks

********** SERVER SNAPSHOT 94 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:01:30:773 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:01:50:774 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:02:10:774 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 28715

Wed Sep 18 02:02:17:733 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:02:30:774 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 28715 terminated

Wed Sep 18 02:02:50:775 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:03:10:776 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:03:30:776 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:03:50:777 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:04:10:778 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:04:30:778 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:04:50:779 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:05:10:779 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-7536
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-7537

Wed Sep 18 02:05:12:549 2019
*** ERROR => DpHdlDeadWp: W1 (pid 7536) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7536) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7536)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7537) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7537) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7537)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:05:30:780 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:05:50:781 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:06:10:781 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:06:30:782 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:06:50:782 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:02:10 2019, skip new snapshot

Wed Sep 18 02:07:10:783 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 95 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:07:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:07:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10938208, readCount 10938208)
UPD : 0 (peak 31, writeCount 2338, readCount 2338)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080447, readCount 1080447)
SPO : 0 (peak 2, writeCount 12077, readCount 12077)
UP2 : 0 (peak 1, writeCount 1157, readCount 1157)
DISP: 0 (peak 67, writeCount 420635, readCount 420635)
GW : 0 (peak 45, writeCount 9978406, readCount 9978406)
ICM : 0 (peak 186, writeCount 198641, readCount 198641)
LWP : 0 (peak 15, writeCount 18313, readCount 18313)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:07:10 2019
------------------------------------------------------------
Current snapshot id: 95
DB clean time (in percent of total time) : 24.23 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |115|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |115|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:07:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:06:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:07:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 02:06:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:07:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:07:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2742| 48| | |
| 1|DDLOG | 2742| 48| | |
| 2|BTCSCHED | 5486| 50| | |
| 3|RESTART_ALL | 1097| 166| | |
| 4|ENVCHECK | 16461| 20| | |
| 5|AUTOABAP | 1097| 166| | |
| 6|BGRFC_WATCHDOG | 1098| 166| | |
| 7|AUTOTH | 1567| 56| | |
| 8|AUTOCCMS | 5486| 50| | |
| 9|AUTOSECURITY | 5486| 50| | |
| 10|LOAD_CALCULATION | 328778| 1| | |
| 11|SPOOLALRM | 5487| 50| | |
| 12|CALL_DELAYED | 0| 425| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 95 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:07:30:783 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:07:50:784 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:08:10:784 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16040

Wed Sep 18 02:08:17:738 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:08:30:785 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16040 terminated

Wed Sep 18 02:08:50:786 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:09:10:786 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:09:30:786 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:09:50:787 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:10:10:787 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-24335
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-24336

Wed Sep 18 02:10:12:496 2019
*** ERROR => DpHdlDeadWp: W1 (pid 24335) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24335) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 24335)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 24336) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W12 (pid = 24336)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:10:30:788 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:10:50:788 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:11:10:789 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:11:30:790 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:11:50:790 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:12:10:791 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:12:30:792 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:12:50:792 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:08:10 2019, skip new snapshot

Wed Sep 18 02:13:10:793 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 96 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:13:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:13:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10939072, readCount 10939072)
UPD : 0 (peak 31, writeCount 2340, readCount 2340)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080455, readCount 1080455)
SPO : 0 (peak 2, writeCount 12091, readCount 12091)
UP2 : 0 (peak 1, writeCount 1159, readCount 1159)
DISP: 0 (peak 67, writeCount 420676, readCount 420676)
GW : 0 (peak 45, writeCount 9979006, readCount 9979006)
ICM : 0 (peak 186, writeCount 198670, readCount 198670)
LWP : 2 (peak 15, writeCount 18343, readCount 18341)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:13:10 2019
------------------------------------------------------------
Current snapshot id: 96
DB clean time (in percent of total time) : 24.23 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |116|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |116|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:13:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:12:58|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:13:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 02:12:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:13:10 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:13:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2745| 48| | |
| 1|DDLOG | 2745| 48| | |
| 2|BTCSCHED | 5492| 50| | |
| 3|RESTART_ALL | 1098| 106| | |
| 4|ENVCHECK | 16479| 20| | |
| 5|AUTOABAP | 1098| 106| | |
| 6|BGRFC_WATCHDOG | 1099| 106| | |
| 7|AUTOTH | 1573| 56| | |
| 8|AUTOCCMS | 5492| 50| | |
| 9|AUTOSECURITY | 5492| 50| | |
| 10|LOAD_CALCULATION | 329137| 1| | |
| 11|SPOOLALRM | 5493| 50| | |
| 12|CALL_DELAYED | 0| 65| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 96 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:13:30:793 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:13:50:794 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:14:10:795 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25550

Wed Sep 18 02:14:17:792 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:14:30:795 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25550 terminated

Wed Sep 18 02:14:50:796 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:15:10:796 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-26382
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-26383

Wed Sep 18 02:15:12:514 2019
*** ERROR => DpHdlDeadWp: W1 (pid 26382) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26382) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 26382)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 26383) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W12 (pid = 26383)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:15:30:797 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:15:50:797 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:16:10:797 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:16:30:798 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:16:50:799 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:17:10:800 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:17:30:800 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:17:50:801 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:18:10:801 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:18:30:802 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:18:50:803 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:14:10 2019, skip new snapshot

Wed Sep 18 02:19:10:803 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 97 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:19:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:19:10 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10939938, readCount 10939938)
UPD : 0 (peak 31, writeCount 2341, readCount 2341)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080459, readCount 1080459)
SPO : 0 (peak 2, writeCount 12104, readCount 12104)
UP2 : 0 (peak 1, writeCount 1160, readCount 1160)
DISP: 0 (peak 67, writeCount 420718, readCount 420718)
GW : 0 (peak 45, writeCount 9979606, readCount 9979606)
ICM : 1 (peak 186, writeCount 198697, readCount 198696)
LWP : 2 (peak 15, writeCount 18358, readCount 18356)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:19:10 2019
------------------------------------------------------------
Current snapshot id: 97
DB clean time (in percent of total time) : 24.24 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |117|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |117|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:19:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:18:58|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:19:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 02:18:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:19:10 2019
------------------------------------------------------------

Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:19:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2748| 48| | |
| 1|DDLOG | 2748| 48| | |
| 2|BTCSCHED | 5498| 50| | |
| 3|RESTART_ALL | 1099| 46| | |
| 4|ENVCHECK | 16497| 20| | |
| 5|AUTOABAP | 1099| 46| | |
| 6|BGRFC_WATCHDOG | 1100| 46| | |
| 7|AUTOTH | 1579| 56| | |
| 8|AUTOCCMS | 5498| 50| | |
| 9|AUTOSECURITY | 5498| 50| | |
| 10|LOAD_CALCULATION | 329496| 1| | |
| 11|SPOOLALRM | 5499| 50| | |
| 12|CALL_DELAYED | 0| 18144| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 97 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:19:30:804 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:19:50:804 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:20:10:804 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-27817
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-27818
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27819

Wed Sep 18 02:20:12:497 2019

*** ERROR => DpHdlDeadWp: W1 (pid 27817) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27817) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27817)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27818) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27818) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27818)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:20:18:032 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:20:30:805 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27819 terminated

Wed Sep 18 02:20:50:806 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:21:10:806 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:21:30:807 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:21:50:808 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:22:10:808 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:22:30:809 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:22:50:809 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:23:10:810 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:23:30:811 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:23:50:812 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
Wed Sep 18 02:24:10:812 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:24:30:812 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:24:50:813 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:20:10 2019, skip new snapshot

Wed Sep 18 02:25:10:813 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 98 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:25:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:25:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10940761, readCount 10940761)
UPD : 0 (peak 31, writeCount 2342, readCount 2342)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080463, readCount 1080463)
SPO : 0 (peak 2, writeCount 12117, readCount 12117)
UP2 : 0 (peak 1, writeCount 1161, readCount 1161)
DISP: 0 (peak 67, writeCount 420763, readCount 420763)
GW : 0 (peak 45, writeCount 9980191, readCount 9980191)
ICM : 0 (peak 186, writeCount 198726, readCount 198726)
LWP : 2 (peak 15, writeCount 18373, readCount 18371)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:25:10 2019
------------------------------------------------------------

Current snapshot id: 98
DB clean time (in percent of total time) : 24.25 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |118|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |118|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:25:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:24:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:25:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 02:24:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:25:10 2019
------------------------------------------------------------

Current pipes in use: 197
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:25:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2751| 48| | |
| 1|DDLOG | 2751| 48| | |
| 2|BTCSCHED | 5504| 50| | |
| 3|RESTART_ALL | 1101| 286| | |
| 4|ENVCHECK | 16515| 20| | |
| 5|AUTOABAP | 1101| 286| | |
| 6|BGRFC_WATCHDOG | 1102| 286| | |
| 7|AUTOTH | 1585| 56| | |
| 8|AUTOCCMS | 5504| 50| | |
| 9|AUTOSECURITY | 5504| 50| | |
| 10|LOAD_CALCULATION | 329854| 1| | |
| 11|SPOOLALRM | 5505| 50| | |
| 12|CALL_DELAYED | 0| 17784| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 98 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-29668
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:25:10:819 2019

DpWpDynCreate: created new work process W12-29669

Wed Sep 18 02:25:12:547 2019

*** ERROR => DpHdlDeadWp: W1 (pid 29668) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29668) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 29668)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 29669) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29669) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 29669)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:25:30:814 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:25:50:815 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:26:10:815 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 29955

Wed Sep 18 02:26:17:846 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:26:30:816 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 29955 terminated

Wed Sep 18 02:26:50:816 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:27:10:817 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:27:30:817 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:27:50:818 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:28:10:818 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:28:30:819 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:28:50:820 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:29:10:820 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:29:30:821 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:29:50:822 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:30:10:822 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-31439
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-31440

Wed Sep 18 02:30:12:532 2019

*** ERROR => DpHdlDeadWp: W1 (pid 31439) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31439) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31439)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 31440) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31440) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 31440)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:30:30:822 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:30:50:823 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:26:10 2019, skip new snapshot

Wed Sep 18 02:31:10:823 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 99 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:31:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:31:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10941615, readCount 10941615)
UPD : 0 (peak 31, writeCount 2343, readCount 2343)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080467, readCount 1080467)
SPO : 0 (peak 2, writeCount 12130, readCount 12130)
UP2 : 0 (peak 1, writeCount 1162, readCount 1162)
DISP: 0 (peak 67, writeCount 420804, readCount 420804)
GW : 0 (peak 45, writeCount 9980761, readCount 9980761)
ICM : 1 (peak 186, writeCount 198753, readCount 198752)
LWP : 0 (peak 15, writeCount 18388, readCount 18388)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 122 (rq_id 29360469, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:31:10 2019
------------------------------------------------------------

Current snapshot id: 99
DB clean time (in percent of total time) : 24.26 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |120|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |120|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:31:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:30:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:31:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 02:30:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:31:10 2019
------------------------------------------------------------

Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:31:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2754| 48| | |
| 1|DDLOG | 2754| 48| | |
| 2|BTCSCHED | 5510| 50| | |
| 3|RESTART_ALL | 1102| 226| | |
| 4|ENVCHECK | 16533| 20| | |
| 5|AUTOABAP | 1102| 226| | |
| 6|BGRFC_WATCHDOG | 1103| 226| | |
| 7|AUTOTH | 1591| 56| | |
| 8|AUTOCCMS | 5510| 50| | |
| 9|AUTOSECURITY | 5510| 50| | |
| 10|LOAD_CALCULATION | 330213| 1| | |
| 11|SPOOLALRM | 5511| 50| | |
| 12|CALL_DELAYED | 0| 17424| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 99 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:31:30:824 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:31:50:824 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:32:10:825 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 31910

Wed Sep 18 02:32:17:353 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:32:30:826 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 31910 terminated

Wed Sep 18 02:32:50:826 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new
snapshot

Wed Sep 18 02:33:10:827 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:33:30:828 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:33:50:828 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:34:10:828 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:34:30:829 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:34:50:830 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:35:10:830 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-874
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-875

Wed Sep 18 02:35:12:544 2019


*** ERROR => DpHdlDeadWp: W1 (pid 874) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=874) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 874)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 875) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=875) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 875)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
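The status=65280 reported in the DpHdlDeadWp errors above is the raw POSIX wait status of the dead child; the dispatcher decodes it to exit code 255 on the DpTraceWpStatus line. A minimal sketch of the same decoding (plain Python on Linux, nothing SAP-specific):

```python
import os

# Raw wait status as reported by the dispatcher for W1 (pid 874).
status = 65280  # 0xFF00

# On Linux the low 7 bits carry the terminating signal (0 = normal exit)
# and the next byte carries the exit code, so exit code 255 shows up as
# 255 << 8 = 65280.
assert os.WIFEXITED(status)      # exited normally, not killed by a signal
print(os.WEXITSTATUS(status))    # -> 255, matching "exited with exit code 255"
```

The same arithmetic explains why status values in these traces are always multiples of 256 when a work process exits on its own.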

Wed Sep 18 02:35:30:831 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:35:50:831 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:36:10:832 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:36:30:833 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:36:50:833 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:32:10 2019, skip new snapshot

Wed Sep 18 02:37:10:833 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 100 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:37:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:37:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10942488, readCount 10942488)


UPD : 0 (peak 31, writeCount 2344, readCount 2344)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080471, readCount 1080471)
SPO : 0 (peak 2, writeCount 12143, readCount 12143)
UP2 : 0 (peak 1, writeCount 1163, readCount 1163)
DISP: 0 (peak 67, writeCount 420845, readCount 420845)
GW : 0 (peak 45, writeCount 9981365, readCount 9981365)
ICM : 0 (peak 186, writeCount 198780, readCount 198780)
LWP : 0 (peak 15, writeCount 18403, readCount 18403)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:37:10 2019


------------------------------------------------------------

Current snapshot id: 100


DB clean time (in percent of total time) : 24.26 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |121|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |121|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 02:37:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |02:36:58|6  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:37:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  6|Wed Sep 18 02:36:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:37:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:37:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2757|        48|                    |          |
|       1|DDLOG               |      2757|        48|                    |          |
|       2|BTCSCHED            |      5516|        50|                    |          |
|       3|RESTART_ALL         |      1103|       166|                    |          |
|       4|ENVCHECK            |     16551|        20|                    |          |
|       5|AUTOABAP            |      1103|       166|                    |          |
|       6|BGRFC_WATCHDOG      |      1104|       166|                    |          |
|       7|AUTOTH              |      1597|        56|                    |          |
|       8|AUTOCCMS            |      5516|        50|                    |          |
|       9|AUTOSECURITY        |      5516|        50|                    |          |
|      10|LOAD_CALCULATION    |    330572|         1|                    |          |
|      11|SPOOLALRM           |      5517|        50|                    |          |
|      12|CALL_DELAYED        |         0|     17064|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 100 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
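These SERVER SNAPSHOT blocks recur throughout the trace (one roughly every six minutes while W1 and W12 keep dying), so for post-mortem work it helps to pull them out programmatically. A minimal sketch — assuming the extraction-wrapped lines have been rejoined, and using a small in-memory sample rather than the real dev_disp file:

```python
import re

def iter_snapshots(lines):
    """Yield (snapshot_id, reason, body_lines) for each SERVER SNAPSHOT block."""
    header = re.compile(r"\*{10} SERVER SNAPSHOT (\d+) \(Reason: (.+?) / Time:")
    snap_id = reason = None
    body = []
    for line in lines:
        m = header.search(line)
        if m and "begin" in line:
            snap_id, reason, body = m.group(1), m.group(2), []
        elif m and "end" in line:
            yield snap_id, reason, body
            snap_id = None
        elif snap_id is not None:
            body.append(line)

sample = [
    "********** SERVER SNAPSHOT 100 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:37:10 2019) - begin **********",
    "Server smprd02_SMP_00, Wed Sep 18 02:37:10 2019",
    "********** SERVER SNAPSHOT 100 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:37:10 2019) - end **********",
]
for sid, reason, body in iter_snapshots(sample):
    print(sid, reason, len(body))   # -> 100 Workprocess 1 died 1
```

On the real file the same generator would be fed `open("dev_disp")`, and the body of each block could then be split into its Queue Statistics, Workprocess Table, and Session Table sections.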


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:37:30:834 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:37:50:835 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:38:10:835 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1723

Wed Sep 18 02:38:17:701 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:38:30:836 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1723 terminated

Wed Sep 18 02:38:50:837 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:39:10:838 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:39:30:838 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:39:50:839 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:40:10:840 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-2812
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-2813

Wed Sep 18 02:40:12:512 2019

*** ERROR => DpHdlDeadWp: W1 (pid 2812) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2812) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 2812)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 2813) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2813) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 2813)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
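The cycle visible here repeats throughout the trace: DpWpDynCreate restarts W1 and W12, and both die again about two seconds later with exit code 255. That one-to-one correspondence between creations and deaths can be confirmed mechanically. A rough tally sketch (illustrative regexes over an embedded sample, not the real file):

```python
import re
from collections import Counter

created = Counter()
died = Counter()

# A few rejoined dev_disp lines; the real file would be read from disk.
log = """DpWpDynCreate: created new work process W1-2812
DpWpDynCreate: created new work process W12-2813
*** ERROR => DpHdlDeadWp: W1 (pid 2812) died (severity=0, status=65280) [dpxxwp.c 1463]
*** ERROR => DpHdlDeadWp: W12 (pid 2813) died (severity=0, status=65280) [dpxxwp.c 1463]"""

for line in log.splitlines():
    m = re.search(r"DpWpDynCreate: created new work process (W\d+)-(\d+)", line)
    if m:
        created[m.group(1)] += 1
    m = re.search(r"DpHdlDeadWp: (W\d+) \(pid (\d+)\) died", line)
    if m:
        died[m.group(1)] += 1

print(created)  # Counter({'W1': 1, 'W12': 1})
print(died)     # Counter({'W1': 1, 'W12': 1})
```

Equal counts per work process, as in this trace, point at the restarted process failing immediately on startup rather than crashing under load.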

Wed Sep 18 02:40:30:840 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:40:50:841 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:41:10:842 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:41:30:842 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:41:50:843 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:42:10:844 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:42:30:844 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:42:50:845 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:38:10 2019, skip new snapshot

Wed Sep 18 02:43:10:846 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 101 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:43:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:43:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10943514, readCount 10943514)


UPD : 0 (peak 31, writeCount 2346, readCount 2346)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080479, readCount 1080479)
SPO : 0 (peak 2, writeCount 12157, readCount 12157)
UP2 : 0 (peak 1, writeCount 1165, readCount 1165)
DISP: 0 (peak 67, writeCount 420886, readCount 420886)
GW : 0 (peak 45, writeCount 9982135, readCount 9982135)
ICM : 1 (peak 186, writeCount 198809, readCount 198808)
LWP : 2 (peak 15, writeCount 18433, readCount 18431)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 25 (rq_id 29365798, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:43:10 2019


------------------------------------------------------------

Current snapshot id: 101


DB clean time (in percent of total time) : 24.27 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |122|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |122|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 02:43:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |02:42:58|0  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:43:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  0|Wed Sep 18 02:42:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:43:10 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:43:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2760|        48|                    |          |
|       1|DDLOG               |      2760|        48|                    |          |
|       2|BTCSCHED            |      5522|        50|                    |          |
|       3|RESTART_ALL         |      1104|       106|                    |          |
|       4|ENVCHECK            |     16569|        20|                    |          |
|       5|AUTOABAP            |      1104|       106|                    |          |
|       6|BGRFC_WATCHDOG      |      1105|       106|                    |          |
|       7|AUTOTH              |      1603|        56|                    |          |
|       8|AUTOCCMS            |      5522|        50|                    |          |
|       9|AUTOSECURITY        |      5522|        50|                    |          |
|      10|LOAD_CALCULATION    |    330930|         1|                    |          |
|      11|SPOOLALRM           |      5523|        50|                    |          |
|      12|CALL_DELAYED        |         0|     16704|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 101 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:43:30:846 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:43:50:847 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:44:10:847 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 3894

Wed Sep 18 02:44:17:847 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:44:30:848 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 3894 terminated

Wed Sep 18 02:44:50:848 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:45:10:849 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-4642
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-4643

Wed Sep 18 02:45:12:578 2019


*** ERROR => DpHdlDeadWp: W1 (pid 4642) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4642) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 4642)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 4643) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4643) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 4643)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:45:30:850 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:45:50:850 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:46:10:851 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:46:30:852 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:46:50:852 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:47:10:853 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:47:30:853 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:47:50:854 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:48:10:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:48:30:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:48:50:856 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:44:10 2019, skip new snapshot

Wed Sep 18 02:49:10:857 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 102 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:49:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:49:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10944355, readCount 10944355)
UPD : 0 (peak 31, writeCount 2347, readCount 2347)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080483, readCount 1080483)
SPO : 0 (peak 2, writeCount 12170, readCount 12170)
UP2 : 0 (peak 1, writeCount 1166, readCount 1166)
DISP: 0 (peak 67, writeCount 420927, readCount 420927)
GW : 0 (peak 45, writeCount 9982705, readCount 9982705)
ICM : 0 (peak 186, writeCount 198836, readCount 198836)
LWP : 2 (peak 15, writeCount 18448, readCount 18446)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:49:10 2019


------------------------------------------------------------

Current snapshot id: 102


DB clean time (in percent of total time) : 24.28 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |123|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |123|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:49:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:48:58|2 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:49:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 2|Wed Sep 18 02:48:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:49:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:49:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2763| 48| | |
| 1|DDLOG | 2763| 48| | |
| 2|BTCSCHED | 5528| 50| | |
| 3|RESTART_ALL | 1105| 46| | |
| 4|ENVCHECK | 16587| 20| | |
| 5|AUTOABAP | 1105| 46| | |
| 6|BGRFC_WATCHDOG | 1106| 46| | |
| 7|AUTOTH | 1609| 56| | |
| 8|AUTOCCMS | 5528| 50| | |
| 9|AUTOSECURITY | 5528| 50| | |
| 10|LOAD_CALCULATION | 331289| 1| | |
| 11|SPOOLALRM | 5529| 50| | |
| 12|CALL_DELAYED | 0| 16344| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 102 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:49:30:857 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:49:50:858 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:50:10:859 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-6045
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-6046
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 6047

Wed Sep 18 02:50:12:573 2019


*** ERROR => DpHdlDeadWp: W1 (pid 6045) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6045) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 6045)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 6046) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=6046) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 6046)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:50:17:606 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:50:30:859 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 6047 terminated

Wed Sep 18 02:50:50:860 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:51:10:861 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:51:30:861 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:51:50:862 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:52:10:862 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:52:30:862 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:52:50:863 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:53:10:863 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:53:30:864 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:53:50:864 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:54:10:865 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:54:30:865 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:54:50:866 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:50:10 2019, skip new snapshot

Wed Sep 18 02:55:10:867 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 103 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:55:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 02:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 02:55:10 2019


------------------------------------------------------------
Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10945174, readCount 10945174)
UPD : 0 (peak 31, writeCount 2348, readCount 2348)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080487, readCount 1080487)
SPO : 0 (peak 2, writeCount 12183, readCount 12183)
UP2 : 0 (peak 1, writeCount 1167, readCount 1167)
DISP: 0 (peak 67, writeCount 420972, readCount 420972)
GW : 0 (peak 45, writeCount 9983276, readCount 9983276)
ICM : 0 (peak 186, writeCount 198863, readCount 198863)
LWP : 2 (peak 15, writeCount 18463, readCount 18461)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 02:55:10 2019


------------------------------------------------------------

Current snapshot id: 103


DB clean time (in percent of total time) : 24.29 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |124|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |124|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 02:55:10 2019


------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |02:54:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 02:55:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 02:54:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 02:55:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 02:55:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2766| 48| | |
| 1|DDLOG | 2766| 48| | |
| 2|BTCSCHED | 5534| 50| | |
| 3|RESTART_ALL | 1107| 286| | |
| 4|ENVCHECK | 16605| 20| | |
| 5|AUTOABAP | 1107| 286| | |
| 6|BGRFC_WATCHDOG | 1108| 286| | |
| 7|AUTOTH | 1615| 56| | |
| 8|AUTOCCMS | 5534| 50| | |
| 9|AUTOSECURITY | 5534| 50| | |
| 10|LOAD_CALCULATION | 331648| 1| | |
| 11|SPOOLALRM | 5535| 50| | |
| 12|CALL_DELAYED | 0| 15984| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 103 (Reason: Workprocess 1 died / Time: Wed Sep 18 02:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-7948
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:55:10:873 2019


DpWpDynCreate: created new work process W12-7949

Wed Sep 18 02:55:12:570 2019


*** ERROR => DpHdlDeadWp: W1 (pid 7948) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7948) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7948)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7949) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7949) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7949)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:55:30:867 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:55:50:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 02:56:10:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 8218

Wed Sep 18 02:56:18:211 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:56:30:869 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 8218 terminated

Wed Sep 18 02:56:50:869 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:57:10:870 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:57:30:870 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:57:50:871 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:58:10:871 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:58:30:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:58:50:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:59:10:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:59:30:873 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 02:59:50:873 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 03:00:10:874 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-9684
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-9685

Wed Sep 18 03:00:12:603 2019


*** ERROR => DpHdlDeadWp: W1 (pid 9684) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9684) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 9684)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 9685) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9685) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 9685)
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 03:00:30:874 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 03:00:50:875 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 02:56:10 2019, skip new snapshot

Wed Sep 18 03:01:10:875 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 104 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:01:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:01:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10946003, readCount 10946003)
UPD : 0 (peak 31, writeCount 2349, readCount 2349)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080491, readCount 1080491)
SPO : 0 (peak 2, writeCount 12196, readCount 12196)
UP2 : 0 (peak 1, writeCount 1168, readCount 1168)
DISP: 0 (peak 67, writeCount 421012, readCount 421012)
GW : 0 (peak 45, writeCount 9983834, readCount 9983834)
ICM : 0 (peak 186, writeCount 198890, readCount 198890)
LWP : 0 (peak 15, writeCount 18478, readCount 18478)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:01:10 2019


------------------------------------------------------------

Current snapshot id: 104


DB clean time (in percent of total time) : 24.30 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |126|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |126|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |
Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 03:01:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:00:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:01:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 03:00:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:01:10 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:01:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2769| 48| | |
| 1|DDLOG | 2769| 48| | |
| 2|BTCSCHED | 5540| 50| | |
| 3|RESTART_ALL | 1108| 226| | |
| 4|ENVCHECK | 16623| 20| | |
| 5|AUTOABAP | 1108| 226| | |
| 6|BGRFC_WATCHDOG | 1109| 226| | |
| 7|AUTOTH | 1621| 56| | |
| 8|AUTOCCMS | 5540| 50| | |
| 9|AUTOSECURITY | 5540| 50| | |
| 10|LOAD_CALCULATION | 332006| 1| | |
| 11|SPOOLALRM | 5541| 50| | |
| 12|CALL_DELAYED | 0| 15624| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 104 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:01:30:876 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:01:50:876 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:02:10:877 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13084

Wed Sep 18 03:02:17:919 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:02:30:877 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13084 terminated

Wed Sep 18 03:02:50:878 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:03:10:879 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:03:30:879 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:03:50:880 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:04:10:880 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:04:30:881 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:04:50:881 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:05:10:881 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-25560
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-25561

Wed Sep 18 03:05:12:602 2019


*** ERROR => DpHdlDeadWp: W1 (pid 25560) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25560) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25560)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25561) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25561) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25561)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:05:30:882 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:05:50:883 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:06:10:883 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:06:30:884 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:06:50:884 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:02:10 2019, skip new snapshot

Wed Sep 18 03:07:10:885 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 105 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:07:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0
Queue Statistics Wed Sep 18 03:07:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10947066, readCount 10947066)


UPD : 0 (peak 31, writeCount 2350, readCount 2350)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080495, readCount 1080495)
SPO : 0 (peak 2, writeCount 12209, readCount 12209)
UP2 : 0 (peak 1, writeCount 1169, readCount 1169)
DISP: 0 (peak 67, writeCount 421053, readCount 421053)
GW : 0 (peak 45, writeCount 9984618, readCount 9984618)
ICM : 0 (peak 186, writeCount 198917, readCount 198917)
LWP : 0 (peak 15, writeCount 18493, readCount 18493)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:07:10 2019


------------------------------------------------------------

Current snapshot id: 105


DB clean time (in percent of total time) : 24.31 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |127|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |127|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 03:07:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |03:06:58|16 |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:07:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   | 16|Wed Sep 18 03:06:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:07:10 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:07:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2772|        48|                    |          |
|       1|DDLOG               |      2772|        48|                    |          |
|       2|BTCSCHED            |      5546|        50|                    |          |
|       3|RESTART_ALL         |      1109|       166|                    |          |
|       4|ENVCHECK            |     16641|        20|                    |          |
|       5|AUTOABAP            |      1109|       166|                    |          |
|       6|BGRFC_WATCHDOG      |      1110|       166|                    |          |
|       7|AUTOTH              |      1627|        56|                    |          |
|       8|AUTOCCMS            |      5546|        50|                    |          |
|       9|AUTOSECURITY        |      5546|        50|                    |          |
|      10|LOAD_CALCULATION    |    332365|         1|                    |          |
|      11|SPOOLALRM           |      5547|        50|                    |          |
|      12|CALL_DELAYED        |         0|     15264|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 105 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:07:30:885 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:07:50:886 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:08:10:886 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 5095

Wed Sep 18 03:08:17:782 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:08:30:886 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 5095 terminated

Wed Sep 18 03:08:50:887 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:09:10:888 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:09:30:888 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:09:50:889 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:10:10:889 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-8737
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-8738

Wed Sep 18 03:10:12:589 2019


*** ERROR => DpHdlDeadWp: W1 (pid 8737) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8737) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8737)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8738) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8738) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8738)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
Wed Sep 18 03:10:30:890 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:10:50:890 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:11:10:890 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:11:30:891 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:11:50:892 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:12:10:892 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:12:30:892 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:12:50:893 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:08:10 2019, skip new snapshot

Wed Sep 18 03:13:10:893 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 106 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:13:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:13:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:13:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10947907, readCount 10947907)


UPD : 0 (peak 31, writeCount 2352, readCount 2352)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080503, readCount 1080503)
SPO : 0 (peak 2, writeCount 12223, readCount 12223)
UP2 : 0 (peak 1, writeCount 1171, readCount 1171)
DISP: 0 (peak 67, writeCount 421094, readCount 421094)
GW : 0 (peak 45, writeCount 9985230, readCount 9985230)
ICM : 1 (peak 186, writeCount 198946, readCount 198945)
LWP : 2 (peak 15, writeCount 18523, readCount 18521)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:13:10 2019


------------------------------------------------------------

Current snapshot id: 106


DB clean time (in percent of total time) : 24.31 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |128|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |128|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 03:13:10 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC    |T54_U9861_M0    |001|SMDAGENT_SMP|smprd02.niladv.org  |03:12:58|3  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:13:10 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0               |T54_U9861_M0_I0 |ALLOCATED       |SERVER|RECEIVE   |  3|Wed Sep 18 03:12:58 2019        |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:13:10 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:13:10 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2775|        48|                    |          |
|       1|DDLOG               |      2775|        48|                    |          |
|       2|BTCSCHED            |      5552|        50|                    |          |
|       3|RESTART_ALL         |      1110|       106|                    |          |
|       4|ENVCHECK            |     16659|        20|                    |          |
|       5|AUTOABAP            |      1110|       106|                    |          |
|       6|BGRFC_WATCHDOG      |      1111|       106|                    |          |
|       7|AUTOTH              |      1633|        56|                    |          |
|       8|AUTOCCMS            |      5552|        50|                    |          |
|       9|AUTOSECURITY        |      5552|        50|                    |          |
|      10|LOAD_CALCULATION    |    332724|         1|                    |          |
|      11|SPOOLALRM           |      5553|        50|                    |          |
|      12|CALL_DELAYED        |         0|     14904|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 106 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:13:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
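Each snapshot banner carries a sequence number, a trigger reason, and a timestamp, so the snapshot cadence can be recovered mechanically. A sketch along those lines — the regex and date format are inferred from the banners in this trace, as an illustration rather than any SAP-provided tooling:

```python
import re
from datetime import datetime

# Matches banners such as:
# ********** SERVER SNAPSHOT 106 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:13:10 2019) - begin **********
SNAP = re.compile(
    r"SERVER SNAPSHOT (\d+) \(Reason: (.+?) / Time: (\w{3} \w{3}\s+\d+ [\d:]+ \d{4})\) - begin"
)

def snapshot_times(text):
    """Return (number, reason, datetime) for every snapshot 'begin' banner."""
    return [(int(n), reason, datetime.strptime(ts, "%a %b %d %H:%M:%S %Y"))
            for n, reason, ts in SNAP.findall(text)]
```

Applied to snapshots 104 through 106 above, the interval between consecutive banners comes out at 360 seconds: one snapshot per six-minute monitoring period while W1 and W12 keep dying.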


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:13:30:894 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:13:50:895 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:14:10:895 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 10055

Wed Sep 18 03:14:18:358 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:14:30:896 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 10055 terminated
Wed Sep 18 03:14:50:896 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:15:10:897 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10668
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10669

Wed Sep 18 03:15:12:621 2019


*** ERROR => DpHdlDeadWp: W1 (pid 10668) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10668) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10668)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10669) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10669) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10669)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:15:30:898 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:15:50:898 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:16:10:899 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:16:30:899 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:16:50:900 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:17:10:900 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:17:30:901 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:17:50:902 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:18:10:902 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:18:30:903 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new snapshot

Wed Sep 18 03:18:50:903 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:14:10 2019, skip new
snapshot

Wed Sep 18 03:19:10:904 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 107 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:19:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:19:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:19:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10948780, readCount 10948780)
UPD : 0 (peak 31, writeCount 2353, readCount 2353)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080507, readCount 1080507)
SPO : 0 (peak 2, writeCount 12236, readCount 12236)
UP2 : 0 (peak 1, writeCount 1172, readCount 1172)
DISP: 0 (peak 67, writeCount 421135, readCount 421135)
GW : 0 (peak 45, writeCount 9985830, readCount 9985830)
ICM : 0 (peak 186, writeCount 198973, readCount 198973)
LWP : 2 (peak 15, writeCount 18538, readCount 18536)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:19:10 2019
------------------------------------------------------------

Current snapshot id: 107

DB clean time (in percent of total time) : 24.32 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |129|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |129|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 03:19:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:18:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:19:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Wed Sep 18 03:18:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:19:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:19:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2778| 48| | |
| 1|DDLOG | 2778| 48| | |
| 2|BTCSCHED | 5558| 50| | |
| 3|RESTART_ALL | 1111| 46| | |
| 4|ENVCHECK | 16677| 20| | |
| 5|AUTOABAP | 1111| 46| | |
| 6|BGRFC_WATCHDOG | 1112| 46| | |
| 7|AUTOTH | 1639| 56| | |
| 8|AUTOCCMS | 5558| 50| | |
| 9|AUTOSECURITY | 5558| 50| | |
| 10|LOAD_CALCULATION | 333083| 1| | |
| 11|SPOOLALRM | 5559| 50| | |
| 12|CALL_DELAYED | 0| 14544| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 107 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:19:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:19:30:904 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:19:50:905 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:20:10:905 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-12181
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-12182
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 12183

Wed Sep 18 03:20:12:584 2019
*** ERROR => DpHdlDeadWp: W1 (pid 12181) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12181) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 12181)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 12182) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12182) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 12182)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:20:17:889 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:20:30:906 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 12183 terminated

Wed Sep 18 03:20:50:906 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:21:10:907 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:21:30:908 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:21:50:908 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:22:10:909 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:22:30:909 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:22:50:910 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:23:10:910 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:23:30:911 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:23:50:912 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:24:10:912 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:24:30:913 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:24:50:913 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:20:10 2019, skip new snapshot

Wed Sep 18 03:25:10:913 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 108 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:25:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:25:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:25:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10949646, readCount 10949646)
UPD : 0 (peak 31, writeCount 2354, readCount 2354)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080511, readCount 1080511)
SPO : 0 (peak 2, writeCount 12249, readCount 12249)
UP2 : 0 (peak 1, writeCount 1173, readCount 1173)
DISP: 0 (peak 67, writeCount 421180, readCount 421180)
GW : 1 (peak 45, writeCount 9986421, readCount 9986420)
ICM : 1 (peak 186, writeCount 199000, readCount 198999)
LWP : 2 (peak 15, writeCount 18553, readCount 18551)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests

Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 83 (rq_id 29382798, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 53 (rq_id 29382799, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:25:10 2019
------------------------------------------------------------

Current snapshot id: 108

DB clean time (in percent of total time) : 24.33 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |130|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |130|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 03:25:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:24:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:25:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Wed Sep 18 03:24:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:25:10 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:25:10 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2781| 48| | |
| 1|DDLOG | 2781| 48| | |
| 2|BTCSCHED | 5564| 50| | |
| 3|RESTART_ALL | 1113| 286| | |
| 4|ENVCHECK | 16695| 20| | |
| 5|AUTOABAP | 1113| 286| | |
| 6|BGRFC_WATCHDOG | 1114| 286| | |
| 7|AUTOTH | 1645| 56| | |
| 8|AUTOCCMS | 5564| 50| | |
| 9|AUTOSECURITY | 5564| 50| | |
| 10|LOAD_CALCULATION | 333441| 1| | |
| 11|SPOOLALRM | 5565| 50| | |
| 12|CALL_DELAYED | 0| 14184| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 108 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:25:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-14103
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:25:10:919 2019
DpWpDynCreate: created new work process W12-14104

Wed Sep 18 03:25:12:628 2019
*** ERROR => DpHdlDeadWp: W1 (pid 14103) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14103) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14103)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14104) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14104) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14104)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:25:30:914 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:25:50:915 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:26:10:915 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 14292

Wed Sep 18 03:26:18:485 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:26:30:916 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 14292 terminated

Wed Sep 18 03:26:50:917 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:27:10:917 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:27:30:918 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:27:50:918 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:28:10:918 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:28:30:919 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:28:50:920 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:29:10:920 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:29:30:921 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:29:50:922 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:30:10:923 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-15821
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-15822

Wed Sep 18 03:30:12:407 2019
*** ERROR => DpHdlDeadWp: W1 (pid 15821) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15821) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 15821)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 15822) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15822) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 15822)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:30:30:923 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:30:50:924 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:26:10 2019, skip new snapshot

Wed Sep 18 03:31:10:924 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 109 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:31:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:31:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0
Queue Statistics Wed Sep 18 03:31:10 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10950456, readCount 10950456)
UPD : 0 (peak 31, writeCount 2355, readCount 2355)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080515, readCount 1080515)
SPO : 0 (peak 2, writeCount 12262, readCount 12262)
UP2 : 0 (peak 1, writeCount 1174, readCount 1174)
DISP: 0 (peak 67, writeCount 421221, readCount 421221)
GW : 0 (peak 45, writeCount 9986979, readCount 9986979)
ICM : 0 (peak 186, writeCount 199027, readCount 199027)
LWP : 0 (peak 15, writeCount 18568, readCount 18568)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:31:10 2019
------------------------------------------------------------

Current snapshot id: 109

DB clean time (in percent of total time) : 24.34 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |132|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |132|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 03:31:10 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:30:58|6 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:31:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 6|Wed Sep 18 03:30:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:31:10 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:31:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2784| 48| | |
| 1|DDLOG | 2784| 48| | |
| 2|BTCSCHED | 5570| 50| | |
| 3|RESTART_ALL | 1114| 226| | |
| 4|ENVCHECK | 16713| 20| | |
| 5|AUTOABAP | 1114| 226| | |
| 6|BGRFC_WATCHDOG | 1115| 226| | |
| 7|AUTOTH | 1651| 56| | |
| 8|AUTOCCMS | 5570| 50| | |
| 9|AUTOSECURITY | 5570| 50| | |
| 10|LOAD_CALCULATION | 333799| 1| | |
| 11|SPOOLALRM | 5571| 50| | |
| 12|CALL_DELAYED | 0| 13824| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 109 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:31:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:31:30:925 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:31:50:925 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:32:10:926 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16563

Wed Sep 18 03:32:18:359 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:32:30:926 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16563 terminated

Wed Sep 18 03:32:50:927 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:33:10:927 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:33:30:928 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:33:50:929 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:34:10:929 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:34:30:930 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:34:50:930 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
Wed Sep 18 03:35:10:931 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-17694
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-17695

Wed Sep 18 03:35:12:636 2019


*** ERROR => DpHdlDeadWp: W1 (pid 17694) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17694) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 17694)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 17695) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17695) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 17695)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:35:30:932 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:35:50:932 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:36:10:933 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:36:30:934 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:36:50:934 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:32:10 2019, skip new snapshot

Wed Sep 18 03:37:10:935 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 110 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:37:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:37:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:37:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10951315, readCount 10951315)


UPD : 0 (peak 31, writeCount 2356, readCount 2356)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080519, readCount 1080519)
SPO : 0 (peak 2, writeCount 12275, readCount 12275)
UP2 : 0 (peak 1, writeCount 1175, readCount 1175)
DISP: 0 (peak 67, writeCount 421262, readCount 421262)
GW : 0 (peak 45, writeCount 9987565, readCount 9987565)
ICM : 0 (peak 186, writeCount 199054, readCount 199054)
LWP : 0 (peak 15, writeCount 18583, readCount 18583)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:37:10 2019


------------------------------------------------------------

Current snapshot id: 110


DB clean time (in percent of total time) : 24.35 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |133|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 6|32125 |DIA |WP_RUN | | |norm|T14_U616_M0 |HTTP_NORM| | |0 |<HANDLE PLUGIN> |000| | | |
| 12| |BTC |WP_KILL| |133|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 03:37:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T14_U616_M0 |000| |10.54.36.33 |03:37:10|6 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:36:58|6 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:37:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 6|Wed Sep 18 03:36:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:37:10 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:37:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2787| 48| | |
| 1|DDLOG | 2787| 48| | |
| 2|BTCSCHED | 5576| 50| | |
| 3|RESTART_ALL | 1115| 166| | |
| 4|ENVCHECK | 16731| 20| | |
| 5|AUTOABAP | 1115| 166| | |
| 6|BGRFC_WATCHDOG | 1116| 166| | |
| 7|AUTOTH | 1657| 56| | |
| 8|AUTOCCMS | 5576| 50| | |
| 9|AUTOSECURITY | 5576| 50| | |
| 10|LOAD_CALCULATION | 334158| 1| | |
| 11|SPOOLALRM | 5577| 50| | |
| 12|CALL_DELAYED | 0| 13464| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 110 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:37:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:37:30:936 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:37:50:936 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:38:10:937 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18419

Wed Sep 18 03:38:17:497 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:38:30:937 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 18419 terminated
Wed Sep 18 03:38:50:938 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:39:10:938 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:39:30:939 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:39:50:939 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:40:10:940 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19393
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19394

Wed Sep 18 03:40:12:661 2019


*** ERROR => DpHdlDeadWp: W1 (pid 19393) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19393) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19393)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19394) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19394) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19394)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:40:30:941 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:40:50:941 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:41:10:942 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:41:30:942 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:41:50:943 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:42:10:943 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:42:30:944 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:42:50:944 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:38:10 2019, skip new snapshot

Wed Sep 18 03:43:10:945 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 111 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:43:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:43:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:43:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10952183, readCount 10952183)


UPD : 0 (peak 31, writeCount 2358, readCount 2358)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080527, readCount 1080527)
SPO : 0 (peak 2, writeCount 12289, readCount 12289)
UP2 : 0 (peak 1, writeCount 1177, readCount 1177)
DISP: 0 (peak 67, writeCount 421307, readCount 421307)
GW : 0 (peak 45, writeCount 9988141, readCount 9988141)
ICM : 1 (peak 186, writeCount 199083, readCount 199082)
LWP : 2 (peak 15, writeCount 18613, readCount 18611)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:43:10 2019


------------------------------------------------------------

Current snapshot id: 111


DB clean time (in percent of total time) : 24.36 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |134|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |134|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 03:43:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T39_U957_M0 |000| |SST-LAP-LEN0043 |03:42:54|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:42:58|5 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:43:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Wed Sep 18 03:42:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:43:10 2019


------------------------------------------------------------
Current pipes in use: 225
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:43:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2790| 48| | |
| 1|DDLOG | 2790| 48| | |
| 2|BTCSCHED | 5582| 50| | |
| 3|RESTART_ALL | 1116| 106| | |
| 4|ENVCHECK | 16749| 20| | |
| 5|AUTOABAP | 1116| 106| | |
| 6|BGRFC_WATCHDOG | 1117| 106| | |
| 7|AUTOTH | 1663| 56| | |
| 8|AUTOCCMS | 5582| 50| | |
| 9|AUTOSECURITY | 5582| 50| | |
| 10|LOAD_CALCULATION | 334517| 1| | |
| 11|SPOOLALRM | 5583| 50| | |
| 12|CALL_DELAYED | 0| 13104| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 111 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:43:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:43:30:945 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:43:50:946 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:44:10:947 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 20460

Wed Sep 18 03:44:17:945 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot

Wed Sep 18 03:44:30:948 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 20460 terminated

Wed Sep 18 03:44:50:948 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot

Wed Sep 18 03:45:10:949 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21238
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21239

Wed Sep 18 03:45:12:668 2019


*** ERROR => DpHdlDeadWp: W1 (pid 21238) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21238) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21238)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21239) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21239) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21239)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot

Wed Sep 18 03:45:30:950 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot

Wed Sep 18 03:45:50:950 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot

Wed Sep 18 03:46:10:951 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new snapshot

Wed Sep 18 03:46:30:951 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:46:50:952 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:47:10:953 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:47:30:953 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:47:50:954 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:48:10:954 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:48:30:955 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot

Wed Sep 18 03:48:50:955 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:44:10 2019, skip new
snapshot
Wed Sep 18 03:49:10:956 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 112 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:49:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:49:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:49:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10952983, readCount 10952983)
UPD : 0 (peak 31, writeCount 2359, readCount 2359)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080531, readCount 1080531)
SPO : 0 (peak 2, writeCount 12302, readCount 12302)
UP2 : 0 (peak 1, writeCount 1178, readCount 1178)
DISP: 0 (peak 67, writeCount 421349, readCount 421349)
GW : 0 (peak 45, writeCount 9988705, readCount 9988705)
ICM : 0 (peak 186, writeCount 199110, readCount 199110)
LWP : 2 (peak 15, writeCount 18628, readCount 18626)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:49:10 2019


------------------------------------------------------------

Current snapshot id: 112


DB clean time (in percent of total time) : 24.36 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |135|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |135|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 03:49:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:48:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:49:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 03:48:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:49:10 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:49:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2793| 48| | |
| 1|DDLOG | 2793| 48| | |
| 2|BTCSCHED | 5588| 50| | |
| 3|RESTART_ALL | 1117| 46| | |
| 4|ENVCHECK | 16767| 20| | |
| 5|AUTOABAP | 1117| 46| | |
| 6|BGRFC_WATCHDOG | 1118| 46| | |
| 7|AUTOTH | 1669| 56| | |
| 8|AUTOCCMS | 5588| 50| | |
| 9|AUTOSECURITY | 5588| 50| | |
| 10|LOAD_CALCULATION | 334876| 1| | |
| 11|SPOOLALRM | 5589| 50| | |
| 12|CALL_DELAYED | 0| 12744| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 112 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:49:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:49:30:957 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:49:50:957 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:50:10:957 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-22613
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-22614
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22615

Wed Sep 18 03:50:12:656 2019


*** ERROR => DpHdlDeadWp: W1 (pid 22613) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22613) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 22613)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 22614) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22614) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 22614)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:50:18:058 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:50:30:958 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22615 terminated

Wed Sep 18 03:50:50:958 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:51:10:959 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:51:30:960 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:51:50:960 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:52:10:960 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:52:30:961 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:52:50:962 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:53:10:962 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:53:30:962 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:53:50:963 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:54:10:964 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:54:30:965 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:54:50:965 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:50:10 2019, skip new snapshot

Wed Sep 18 03:55:10:966 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 113 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:55:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 03:55:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 03:55:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10953832, readCount 10953832)
UPD : 0 (peak 31, writeCount 2360, readCount 2360)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080535, readCount 1080535)
SPO : 0 (peak 2, writeCount 12315, readCount 12315)
UP2 : 0 (peak 1, writeCount 1179, readCount 1179)
DISP: 0 (peak 67, writeCount 421394, readCount 421394)
GW : 0 (peak 45, writeCount 9989270, readCount 9989270)
ICM : 0 (peak 186, writeCount 199137, readCount 199137)
LWP : 2 (peak 15, writeCount 18643, readCount 18641)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 03:55:10 2019


------------------------------------------------------------

Current snapshot id: 113


DB clean time (in percent of total time) : 24.37 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |136|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |136|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 03:55:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |03:54:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 03:55:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 03:54:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 03:55:10 2019


------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 03:55:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2796| 48| | |
| 1|DDLOG | 2796| 48| | |
| 2|BTCSCHED | 5594| 50| | |
| 3|RESTART_ALL | 1119| 286| | |
| 4|ENVCHECK | 16785| 20| | |
| 5|AUTOABAP | 1119| 286| | |
| 6|BGRFC_WATCHDOG | 1120| 286| | |
| 7|AUTOTH | 1675| 56| | |
| 8|AUTOCCMS | 5594| 50| | |
| 9|AUTOSECURITY | 5594| 50| | |
| 10|LOAD_CALCULATION | 335234| 1| | |
| 11|SPOOLALRM | 5595| 50| | |
| 12|CALL_DELAYED | 0| 12384| | |
Found 13 periodic tasks

********** SERVER SNAPSHOT 113 (Reason: Workprocess 1 died / Time: Wed Sep 18 03:55:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-24658
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:55:10:971 2019


DpWpDynCreate: created new work process W12-24659

Wed Sep 18 03:55:12:674 2019


*** ERROR => DpHdlDeadWp: W1 (pid 24658) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24658) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 24658)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 24659) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24659) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 24659)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:55:30:966 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:55:50:967 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 03:56:10:967 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 24989

Wed Sep 18 03:56:17:459 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:56:30:968 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 24989 terminated

Wed Sep 18 03:56:50:968 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:57:10:968 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:57:30:969 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:57:50:970 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:58:10:970 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:58:30:971 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:58:50:972 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:59:10:972 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:59:30:973 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 03:59:50:973 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 04:00:10:973 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-26615
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-26616

Wed Sep 18 04:00:12:670 2019


*** ERROR => DpHdlDeadWp: W1 (pid 26615) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26615) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 26615)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 26616) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26616) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 26616)
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 04:00:30:974 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 04:00:50:974 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 03:56:10 2019, skip new snapshot

Wed Sep 18 04:01:10:975 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 114 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:01:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:01:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:01:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10954635, readCount 10954635)
UPD : 0 (peak 31, writeCount 2361, readCount 2361)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080539, readCount 1080539)
SPO : 0 (peak 2, writeCount 12328, readCount 12328)
UP2 : 0 (peak 1, writeCount 1180, readCount 1180)
DISP: 0 (peak 67, writeCount 421435, readCount 421435)
GW : 0 (peak 45, writeCount 9989828, readCount 9989828)
ICM : 0 (peak 186, writeCount 199164, readCount 199164)
LWP : 0 (peak 15, writeCount 18658, readCount 18658)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:01:10 2019


------------------------------------------------------------

Current snapshot id: 114


DB clean time (in percent of total time) : 24.38 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |138|norm|T138_U21626_M0 |HTTP_NORM| | ||CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| ||
| 12| |BTC |WP_KILL| |138|low |T105_U21576_M0 |BATCH | | ||SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD ||

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:01:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:00:58|5 |SAPMSSY1 |norm| || | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | || | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| || | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:01:10 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 5|Wed Sep 18 04:00:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:01:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:01:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2799| 48| ||
| 1|DDLOG | 2799| 48| ||
| 2|BTCSCHED | 5600| 50| ||
| 3|RESTART_ALL | 1120| 226| ||
| 4|ENVCHECK | 16803| 20| ||
| 5|AUTOABAP | 1120| 226| ||
| 6|BGRFC_WATCHDOG | 1121| 226| ||
| 7|AUTOTH | 1681| 56| ||
| 8|AUTOCCMS | 5600| 50| ||
| 9|AUTOSECURITY | 5600| 50| ||
| 10|LOAD_CALCULATION | 335593| 1| ||
| 11|SPOOLALRM | 5601| 50| ||
| 12|CALL_DELAYED | 0| 12024| ||

Found 13 periodic tasks

********** SERVER SNAPSHOT 114 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:01:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:01:30:976 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:01:50:976 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:02:10:976 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 29815

Wed Sep 18 04:02:17:377 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:02:30:977 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 29815 terminated

Wed Sep 18 04:02:50:978 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:03:10:982 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:03:30:983 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:03:50:983 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:04:10:986 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:04:30:986 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:04:50:986 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:05:10:987 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10296
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10297

Wed Sep 18 04:05:12:770 2019

*** ERROR => DpHdlDeadWp: W1 (pid 10296) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10296) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10296)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10297) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10297) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10297)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:05:30:988 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:05:50:988 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:06:10:989 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:06:30:990 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:06:50:991 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:02:10 2019, skip new snapshot

Wed Sep 18 04:07:10:991 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
********** SERVER SNAPSHOT 115 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:07:10 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:07:10 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:07:10 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10955705, readCount 10955705)
UPD : 0 (peak 31, writeCount 2362, readCount 2362)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080543, readCount 1080543)
SPO : 0 (peak 2, writeCount 12341, readCount 12341)
UP2 : 0 (peak 1, writeCount 1181, readCount 1181)
DISP: 0 (peak 67, writeCount 421476, readCount 421476)
GW : 0 (peak 45, writeCount 9990618, readCount 9990618)
ICM : 1 (peak 186, writeCount 199191, readCount 199190)
LWP : 0 (peak 15, writeCount 18673, readCount 18673)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:07:10 2019


------------------------------------------------------------

Current snapshot id: 115


DB clean time (in percent of total time) : 24.39 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |139|norm|T138_U21626_M0 |HTTP_NORM| | ||CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| ||
| 12| |BTC |WP_KILL| |139|low |T105_U21576_M0 |BATCH | | ||SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD ||

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:07:10 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:06:58|3 |SAPMSSY1 |norm| || | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | || | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| || | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
RFC-Connection Table (1 entries) Wed Sep 18 04:07:10 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 04:06:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:07:10 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:07:10 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2802| 48| ||
| 1|DDLOG | 2802| 48| ||
| 2|BTCSCHED | 5606| 50| ||
| 3|RESTART_ALL | 1121| 166| ||
| 4|ENVCHECK | 16821| 20| ||
| 5|AUTOABAP | 1121| 166| ||
| 6|BGRFC_WATCHDOG | 1122| 166| ||
| 7|AUTOTH | 1687| 56| ||
| 8|AUTOCCMS | 5606| 50| ||
| 9|AUTOSECURITY | 5606| 50| ||
| 10|LOAD_CALCULATION | 335952| 1| ||
| 11|SPOOLALRM | 5607| 50| ||
| 12|CALL_DELAYED | 0| 11664| ||

Found 13 periodic tasks


********** SERVER SNAPSHOT 115 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:07:10 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

Wed Sep 18 04:07:11:000 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:07:30:992 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:07:50:992 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:08:10:993 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:08:11:000 2019


DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 24533

Wed Sep 18 04:08:17:755 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:08:30:993 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 24533 terminated

Wed Sep 18 04:08:50:994 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:09:10:995 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:09:30:995 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:09:50:996 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:10:10:996 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W1-25565
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
DpWpDynCreate: created new work process W12-25566

Wed Sep 18 04:10:12:499 2019

*** ERROR => DpHdlDeadWp: W1 (pid 25565) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25565) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25565)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25566) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25566) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25566)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:10:30:997 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:10:50:998 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:11:10:998 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:11:30:999 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:11:50:999 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:12:11:000 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:12:31:000 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:12:51:001 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:08:10 2019, skip new snapshot

Wed Sep 18 04:13:11:002 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 116 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:13:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:13:11 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:13:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10956572, readCount 10956572)
UPD : 0 (peak 31, writeCount 2364, readCount 2364)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080551, readCount 1080551)
SPO : 0 (peak 2, writeCount 12355, readCount 12355)
UP2 : 0 (peak 1, writeCount 1183, readCount 1183)
DISP: 0 (peak 67, writeCount 421517, readCount 421517)
GW : 0 (peak 45, writeCount 9991236, readCount 9991236)
ICM : 0 (peak 186, writeCount 199220, readCount 199220)
LWP : 2 (peak 15, writeCount 18703, readCount 18701)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:13:11 2019


------------------------------------------------------------

Current snapshot id: 116


DB clean time (in percent of total time) : 24.39 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |140|norm|T138_U21626_M0 |HTTP_NORM| | ||CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| ||
| 12| |BTC |WP_KILL| |140|low |T105_U21576_M0 |BATCH | | ||SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD ||

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:13:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:12:58|0 |SAPMSSY1 |norm| || | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | || | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| || | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
RFC-Connection Table (1 entries) Wed Sep 18 04:13:11 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Wed Sep 18 04:12:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:13:11 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:13:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2805| 47| ||
| 1|DDLOG | 2805| 47| ||
| 2|BTCSCHED | 5612| 49| ||
| 3|RESTART_ALL | 1122| 105| ||
| 4|ENVCHECK | 16839| 20| ||
| 5|AUTOABAP | 1122| 105| ||
| 6|BGRFC_WATCHDOG | 1123| 105| ||
| 7|AUTOTH | 1693| 55| ||
| 8|AUTOCCMS | 5612| 49| ||
| 9|AUTOSECURITY | 5612| 49| ||
| 10|LOAD_CALCULATION | 336310| 0| ||
| 11|SPOOLALRM | 5613| 49| ||
| 12|CALL_DELAYED | 0| 11303| ||

Found 13 periodic tasks


********** SERVER SNAPSHOT 116 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:13:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:13:31:003 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:13:51:004 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:14:11:005 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 26652

Wed Sep 18 04:14:17:629 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new
snapshot

Wed Sep 18 04:14:31:005 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new
snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 26652 terminated

Wed Sep 18 04:14:51:005 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new
snapshot

Wed Sep 18 04:15:11:006 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-27285
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-27286

Wed Sep 18 04:15:12:730 2019


*** ERROR => DpHdlDeadWp: W1 (pid 27285) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27285) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27285)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27286) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27286) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27286)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:15:31:006 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:15:51:007 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:16:11:007 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:16:31:008 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:16:51:008 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:17:11:008 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:17:31:009 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:17:51:010 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:18:11:010 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:18:31:011 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:18:51:011 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:14:11 2019, skip new snapshot

Wed Sep 18 04:19:11:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 117 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:19:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:19:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:19:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10957455, readCount 10957455)


UPD : 0 (peak 31, writeCount 2365, readCount 2365)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080555, readCount 1080555)
SPO : 0 (peak 2, writeCount 12368, readCount 12368)
UP2 : 0 (peak 1, writeCount 1184, readCount 1184)
DISP: 0 (peak 67, writeCount 421558, readCount 421558)
GW : 0 (peak 45, writeCount 9991860, readCount 9991860)
ICM : 1 (peak 186, writeCount 199247, readCount 199246)
LWP : 2 (peak 15, writeCount 18718, readCount 18716)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 287 (rq_id 29404456, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:19:11 2019


------------------------------------------------------------

Current snapshot id: 117


DB clean time (in percent of total time) : 24.40 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |141|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |141|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:19:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:18:58|3 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:19:11 2019


------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 04:18:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:19:11 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:19:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2808| 47| | |
| 1|DDLOG | 2808| 47| | |
| 2|BTCSCHED | 5618| 49| | |
| 3|RESTART_ALL | 1123| 45| | |
| 4|ENVCHECK | 16857| 20| | |
| 5|AUTOABAP | 1123| 45| | |
| 6|BGRFC_WATCHDOG | 1124| 45| | |
| 7|AUTOTH | 1699| 55| | |
| 8|AUTOCCMS | 5618| 49| | |
| 9|AUTOSECURITY | 5618| 49| | |
| 10|LOAD_CALCULATION | 336669| 0| | |
| 11|SPOOLALRM | 5619| 49| | |
| 12|CALL_DELAYED | 0| 10943| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 117 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:19:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:19:31:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:19:51:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:20:11:012 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-28722
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-28723
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 28724

Wed Sep 18 04:20:12:552 2019


*** ERROR => DpHdlDeadWp: W1 (pid 28722) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28722) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28722)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28723) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28723) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28723)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:20:17:778 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:20:31:013 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 28724 terminated

Wed Sep 18 04:20:51:014 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:21:11:015 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:21:31:016 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:21:51:016 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:22:11:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:22:31:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:22:51:017 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:23:11:018 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:23:31:018 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:23:51:019 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:24:11:020 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:24:31:020 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:24:51:020 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:20:11 2019, skip new snapshot

Wed Sep 18 04:25:11:021 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 118 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:25:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:25:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:25:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10958491, readCount 10958491)


UPD : 0 (peak 31, writeCount 2366, readCount 2366)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080559, readCount 1080559)
SPO : 0 (peak 2, writeCount 12381, readCount 12381)
UP2 : 0 (peak 1, writeCount 1185, readCount 1185)
DISP: 0 (peak 67, writeCount 421602, readCount 421602)
GW : 0 (peak 45, writeCount 9992645, readCount 9992645)
ICM : 1 (peak 186, writeCount 199274, readCount 199273)
LWP : 2 (peak 15, writeCount 18733, readCount 18731)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Dump of queue <IcmanQueue> in slot 2 (1 requests, in use, port=27708):
-1 <- 273 (rq_id 29407395, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:25:11 2019


------------------------------------------------------------

Current snapshot id: 118


DB clean time (in percent of total time) : 24.41 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |142|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |142|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:25:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:24:58|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:25:11 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Wed Sep 18 04:24:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:25:11 2019


------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:25:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2811| 47| | |
| 1|DDLOG | 2811| 47| | |
| 2|BTCSCHED | 5624| 49| | |
| 3|RESTART_ALL | 1125| 285| | |
| 4|ENVCHECK | 16875| 20| | |
| 5|AUTOABAP | 1125| 285| | |
| 6|BGRFC_WATCHDOG | 1126| 285| | |
| 7|AUTOTH | 1705| 55| | |
| 8|AUTOCCMS | 5624| 49| | |
| 9|AUTOSECURITY | 5624| 49| | |
| 10|LOAD_CALCULATION | 337027| 0| | |
| 11|SPOOLALRM | 5625| 49| | |
| 12|CALL_DELAYED | 0| 10583| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 118 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:25:11 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W1-30414
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:25:11:027 2019


DpWpDynCreate: created new work process W12-30415

Wed Sep 18 04:25:12:558 2019


*** ERROR => DpHdlDeadWp: W1 (pid 30414) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30414) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 30414)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 30415) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30415) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 30415)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:25:31:022 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:25:51:022 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:26:11:023 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 30677

Wed Sep 18 04:26:17:915 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:26:31:023 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 30677 terminated

Wed Sep 18 04:26:51:024 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:27:11:024 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:27:31:025 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:27:51:025 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:28:11:026 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:28:31:027 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:28:51:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:29:11:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:29:31:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:29:51:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:30:11:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-32252
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-32253

Wed Sep 18 04:30:12:609 2019


*** ERROR => DpHdlDeadWp: W1 (pid 32252) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=32252) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 32252)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 32253) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=32253) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 32253)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:30:31:028 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:30:51:029 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:26:11 2019, skip new snapshot

Wed Sep 18 04:31:11:029 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 119 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:31:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:31:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:31:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10959330, readCount 10959330)


UPD : 0 (peak 31, writeCount 2367, readCount 2367)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080563, readCount 1080563)
SPO : 0 (peak 2, writeCount 12394, readCount 12394)
UP2 : 0 (peak 1, writeCount 1186, readCount 1186)
DISP: 0 (peak 67, writeCount 421643, readCount 421643)
GW : 0 (peak 45, writeCount 9993211, readCount 9993211)
ICM : 0 (peak 186, writeCount 199301, readCount 199301)
LWP : 0 (peak 15, writeCount 18748, readCount 18748)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:31:11 2019


------------------------------------------------------------

Current snapshot id: 119


DB clean time (in percent of total time) : 24.42 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |144|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |144|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:31:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:30:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:31:11 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 04:30:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:31:11 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:31:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2814| 47| | |
| 1|DDLOG | 2814| 47| | |
| 2|BTCSCHED | 5630| 49| | |
| 3|RESTART_ALL | 1126| 225| | |
| 4|ENVCHECK | 16893| 20| | |
| 5|AUTOABAP | 1126| 225| | |
| 6|BGRFC_WATCHDOG | 1127| 225| | |
| 7|AUTOTH | 1711| 55| | |
| 8|AUTOCCMS | 5630| 49| | |
| 9|AUTOSECURITY | 5630| 49| | |
| 10|LOAD_CALCULATION | 337386| 0| | |
| 11|SPOOLALRM | 5631| 49| | |
| 12|CALL_DELAYED | 0| 10223| | |
Found 13 periodic tasks

********** SERVER SNAPSHOT 119 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:31:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:31:31:030 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:31:51:031 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:32:11:032 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 352

Wed Sep 18 04:32:17:856 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:32:31:032 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 352 terminated

Wed Sep 18 04:32:51:033 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:33:11:034 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:33:31:035 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:33:51:036 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:34:11:036 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:34:31:036 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:34:51:036 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:35:11:037 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-1516
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-1517

Wed Sep 18 04:35:12:508 2019


*** ERROR => DpHdlDeadWp: W1 (pid 1516) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1516) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1516)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1517) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1517) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1517)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:35:31:037 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:35:51:038 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:36:11:039 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:36:31:040 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:36:51:041 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:32:11 2019, skip new snapshot

Wed Sep 18 04:37:11:042 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 120 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:37:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:37:11 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:37:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10960188, readCount 10960188)


UPD : 0 (peak 31, writeCount 2368, readCount 2368)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080567, readCount 1080567)
SPO : 0 (peak 2, writeCount 12407, readCount 12407)
UP2 : 0 (peak 1, writeCount 1187, readCount 1187)
DISP: 0 (peak 67, writeCount 421684, readCount 421684)
GW : 1 (peak 45, writeCount 9993791, readCount 9993790)
ICM : 0 (peak 186, writeCount 199328, readCount 199328)
LWP : 0 (peak 15, writeCount 18763, readCount 18763)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 53 (rq_id 29412013, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:37:11 2019


------------------------------------------------------------

Current snapshot id: 120


DB clean time (in percent of total time) : 24.42 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |145|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_RUN | | |norm|T24_U4782_M0 |HTTP_NORM| | |1|<HANDLE PLUGIN> |000| | | |
| 12| |BTC |WP_KILL| |145|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:37:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T24_U4782_M0 |000| |10.54.36.33 |04:37:10|2 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:36:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 25 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:37:11 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 04:36:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:37:11 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:37:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2817| 47| | |
| 1|DDLOG | 2817| 47| | |
| 2|BTCSCHED | 5636| 49| | |
| 3|RESTART_ALL | 1127| 165| | |
| 4|ENVCHECK | 16911| 20| | |
| 5|AUTOABAP | 1127| 165| | |
| 6|BGRFC_WATCHDOG | 1128| 165| | |
| 7|AUTOTH | 1717| 55| | |
| 8|AUTOCCMS | 5636| 49| | |
| 9|AUTOSECURITY | 5636| 49| | |
| 10|LOAD_CALCULATION | 337745| 0| | |
| 11|SPOOLALRM | 5637| 49| | |
| 12|CALL_DELAYED | 0| 9863| | |
Found 13 periodic tasks

********** SERVER SNAPSHOT 120 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:37:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:37:31:043 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:37:51:043 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:38:11:044 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 2821

Wed Sep 18 04:38:18:184 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:38:31:044 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 2821 terminated

Wed Sep 18 04:38:51:045 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:39:11:046 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:39:31:047 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:39:51:047 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:40:11:048 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3694
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3695

Wed Sep 18 04:40:12:779 2019


*** ERROR => DpHdlDeadWp: W1 (pid 3694) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3694) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3694)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3695) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3695) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 3695)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:40:31:049 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:40:51:050 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:41:11:050 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:41:31:051 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:41:51:052 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:42:11:052 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:42:31:053 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:42:51:054 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:38:11 2019, skip new snapshot

Wed Sep 18 04:43:11:055 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 121 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:43:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:43:11 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:43:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10961024, readCount 10961024)


UPD : 0 (peak 31, writeCount 2370, readCount 2370)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080575, readCount 1080575)
SPO : 0 (peak 2, writeCount 12421, readCount 12421)
UP2 : 0 (peak 1, writeCount 1189, readCount 1189)
DISP: 0 (peak 67, writeCount 421725, readCount 421725)
GW : 1 (peak 45, writeCount 9994363, readCount 9994362)
ICM : 0 (peak 186, writeCount 199357, readCount 199357)
LWP : 2 (peak 15, writeCount 18793, readCount 18791)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 287 (rq_id 29414372, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:43:11 2019


------------------------------------------------------------

Current snapshot id: 121


DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |146|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |146|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:43:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:42:58|0 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
RFC-Connection Table (1 entries) Wed Sep 18 04:43:11 2019
------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 0|Wed Sep 18 04:42:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:43:11 2019


------------------------------------------------------------
Current pipes in use: 221
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:43:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2820| 47| | |
| 1|DDLOG | 2820| 47| | |
| 2|BTCSCHED | 5642| 49| | |
| 3|RESTART_ALL | 1128| 105| | |
| 4|ENVCHECK | 16929| 20| | |
| 5|AUTOABAP | 1128| 105| | |
| 6|BGRFC_WATCHDOG | 1129| 105| | |
| 7|AUTOTH | 1723| 55| | |
| 8|AUTOCCMS | 5642| 49| | |
| 9|AUTOSECURITY | 5642| 49| | |
| 10|LOAD_CALCULATION | 338104| 0| | |
| 11|SPOOLALRM | 5643| 49| | |
| 12|CALL_DELAYED | 0| 9503| | |

Found 13 periodic tasks


********** SERVER SNAPSHOT 121 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:43:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:43:31:055 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:43:51:056 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:44:11:057 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 4736

Wed Sep 18 04:44:18:210 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:44:31:057 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 4736 terminated

Wed Sep 18 04:44:51:058 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:45:11:059 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-5394
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-5395

Wed Sep 18 04:45:12:617 2019


*** ERROR => DpHdlDeadWp: W1 (pid 5394) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5394) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5394)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5395) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5395) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5395)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:45:31:059 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:45:51:059 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:46:11:060 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:46:31:061 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:46:51:062 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:47:11:062 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:47:31:062 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:47:51:063 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:48:11:063 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:48:31:064 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:48:51:065 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:44:11 2019, skip new snapshot

Wed Sep 18 04:49:11:066 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 122 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:49:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:49:11 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:49:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10961904, readCount 10961904)
UPD : 0 (peak 31, writeCount 2371, readCount 2371)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080579, readCount 1080579)
SPO : 0 (peak 2, writeCount 12434, readCount 12434)
UP2 : 0 (peak 1, writeCount 1190, readCount 1190)
DISP: 0 (peak 67, writeCount 421766, readCount 421766)
GW : 0 (peak 45, writeCount 9994987, readCount 9994987)
ICM : 0 (peak 186, writeCount 199384, readCount 199384)
LWP : 2 (peak 15, writeCount 18808, readCount 18806)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:49:11 2019


------------------------------------------------------------

Current snapshot id: 122


DB clean time (in percent of total time) : 24.44 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |147|norm|T138_U21626_M0 |HTTP_NORM| | ||CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| ||
| 12| |BTC |WP_KILL| |147|low |T105_U21576_M0 |BATCH | | ||SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD ||

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:49:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:48:58|3 |SAPMSSY1 |norm| || | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | || | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| || | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:49:11 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 04:48:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:49:11 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:49:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2823| 47| ||
| 1|DDLOG | 2823| 47| ||
| 2|BTCSCHED | 5648| 49| ||
| 3|RESTART_ALL | 1129| 45| ||
| 4|ENVCHECK | 16947| 20| ||
| 5|AUTOABAP | 1129| 45| ||
| 6|BGRFC_WATCHDOG | 1130| 45| ||
| 7|AUTOTH | 1729| 55| ||
| 8|AUTOCCMS | 5648| 49| ||
| 9|AUTOSECURITY | 5648| 49| ||
| 10|LOAD_CALCULATION | 338463| 0| ||
| 11|SPOOLALRM | 5649| 49| ||
| 12|CALL_DELAYED | 0| 9143| ||

Found 13 periodic tasks

********** SERVER SNAPSHOT 122 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:49:11 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:49:31:066 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:49:51:067 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:50:11:067 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-7022
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-7023
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7024

Wed Sep 18 04:50:12:205 2019


*** ERROR => DpHdlDeadWp: W1 (pid 7022) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7022) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7022)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7023) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7023) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7023)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:50:18:074 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:50:31:067 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7024 terminated

Wed Sep 18 04:50:51:068 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:51:11:068 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:51:31:069 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:51:51:069 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:52:11:070 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:52:31:071 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:52:51:072 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:53:11:073 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:53:31:073 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:53:51:074 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:54:11:074 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:54:31:074 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:54:51:074 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:50:11 2019, skip new snapshot

Wed Sep 18 04:55:11:075 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 123 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:55:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 04:55:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 04:55:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10962750, readCount 10962750)
UPD : 0 (peak 31, writeCount 2372, readCount 2372)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080583, readCount 1080583)
SPO : 0 (peak 2, writeCount 12447, readCount 12447)
UP2 : 0 (peak 1, writeCount 1191, readCount 1191)
DISP: 0 (peak 67, writeCount 421811, readCount 421811)
GW : 0 (peak 45, writeCount 9995558, readCount 9995558)
ICM : 0 (peak 186, writeCount 199411, readCount 199411)
LWP : 2 (peak 15, writeCount 18823, readCount 18821)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 04:55:11 2019


------------------------------------------------------------

Current snapshot id: 123


DB clean time (in percent of total time) : 24.45 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |148|norm|T138_U21626_M0 |HTTP_NORM| | ||CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| ||
| 12| |BTC |WP_KILL| |148|low |T105_U21576_M0 |BATCH | | ||SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD ||

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 04:55:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:54:58|16 |SAPMSSY1 |norm| || | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | || | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| || | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 04:55:11 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 04:54:58 2019 |
Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 04:55:11 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 04:55:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2826| 47| ||
| 1|DDLOG | 2826| 47| ||
| 2|BTCSCHED | 5654| 49| ||
| 3|RESTART_ALL | 1131| 285| ||
| 4|ENVCHECK | 16965| 20| ||
| 5|AUTOABAP | 1131| 285| ||
| 6|BGRFC_WATCHDOG | 1132| 285| ||
| 7|AUTOTH | 1735| 55| ||
| 8|AUTOCCMS | 5654| 49| ||
| 9|AUTOSECURITY | 5654| 49| ||
| 10|LOAD_CALCULATION | 338821| 0| ||
| 11|SPOOLALRM | 5655| 49| ||
| 12|CALL_DELAYED | 0| 8783| ||

Found 13 periodic tasks

********** SERVER SNAPSHOT 123 (Reason: Workprocess 1 died / Time: Wed Sep 18 04:55:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-8775
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:55:11:081 2019

DpWpDynCreate: created new work process W12-8776

Wed Sep 18 04:55:12:606 2019


*** ERROR => DpHdlDeadWp: W1 (pid 8775) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8775) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8775)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8776) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8776) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8776)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:55:31:076 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:55:51:076 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 04:56:11:076 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9164

Wed Sep 18 04:56:18:230 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:56:31:076 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9164 terminated

Wed Sep 18 04:56:51:077 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:57:11:077 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:57:31:077 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:57:51:078 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:58:11:078 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:58:31:079 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:58:51:079 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:59:11:080 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:59:31:080 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 04:59:51:080 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 05:00:11:081 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10684
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10685

Wed Sep 18 05:00:12:630 2019


*** ERROR => DpHdlDeadWp: W1 (pid 10684) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10684) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10684)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10685) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10685) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10685)
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 05:00:31:082 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot

Wed Sep 18 05:00:51:082 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 04:56:11 2019, skip new
snapshot

Wed Sep 18 05:01:11:082 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 124 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:01:11 2019) - begin **********
Server smprd02_SMP_00, Wed Sep 18 05:01:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:01:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10963562, readCount 10963562)
UPD : 0 (peak 31, writeCount 2373, readCount 2373)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080587, readCount 1080587)
SPO : 0 (peak 2, writeCount 12460, readCount 12460)
UP2 : 0 (peak 1, writeCount 1192, readCount 1192)
DISP: 0 (peak 67, writeCount 421852, readCount 421852)
GW : 0 (peak 45, writeCount 9996128, readCount 9996128)
ICM : 0 (peak 186, writeCount 199438, readCount 199438)
LWP : 0 (peak 15, writeCount 18838, readCount 18838)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:01:11 2019
------------------------------------------------------------
Current snapshot id: 124
DB clean time (in percent of total time) : 24.45 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |150|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |150|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 05:01:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |05:00:58|4 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 05:01:11 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 4|Wed Sep 18 05:00:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:01:11 2019
------------------------------------------------------------
Current pipes in use: 199
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:01:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2829| 47| | |
| 1|DDLOG | 2829| 47| | |
| 2|BTCSCHED | 5660| 49| | |
| 3|RESTART_ALL | 1132| 225| | |
| 4|ENVCHECK | 16983| 20| | |
| 5|AUTOABAP | 1132| 225| | |
| 6|BGRFC_WATCHDOG | 1133| 225| | |
| 7|AUTOTH | 1741| 55| | |
| 8|AUTOCCMS | 5660| 49| | |
| 9|AUTOSECURITY | 5660| 49| | |
| 10|LOAD_CALCULATION | 339180| 0| | |
| 11|SPOOLALRM | 5661| 49| | |
| 12|CALL_DELAYED | 0| 8423| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 124 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:01:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:01:31:083 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:01:51:083 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:02:11:083 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15477

Wed Sep 18 05:02:18:271 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:02:31:084 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15477 terminated

Wed Sep 18 05:02:51:084 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:03:11:085 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:03:31:086 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:03:51:086 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:04:11:087 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:04:31:087 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:04:51:087 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:05:11:088 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-27935
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-27936

Wed Sep 18 05:05:12:608 2019

*** ERROR => DpHdlDeadWp: W1 (pid 27935) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27935) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27935)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27936) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27936) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27936)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:05:31:088 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:05:51:088 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:06:11:089 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:06:31:089 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:06:51:089 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:02:11 2019, skip new snapshot

Wed Sep 18 05:07:11:089 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 125 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:07:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:07:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:07:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10964632, readCount 10964632)
UPD : 0 (peak 31, writeCount 2374, readCount 2374)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080591, readCount 1080591)
SPO : 0 (peak 2, writeCount 12473, readCount 12473)
UP2 : 0 (peak 1, writeCount 1193, readCount 1193)
DISP: 0 (peak 67, writeCount 421892, readCount 421892)
GW : 0 (peak 45, writeCount 9996918, readCount 9996918)
ICM : 0 (peak 186, writeCount 199465, readCount 199465)
LWP : 0 (peak 15, writeCount 18853, readCount 18853)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:07:11 2019
------------------------------------------------------------
Current snapshot id: 125
DB clean time (in percent of total time) : 24.46 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |151|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |151|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 05:07:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |05:06:58|16 |SAPMSSY1 |norm| | | | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 05:07:11 2019
------------------------------------------------------------
|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 16|Wed Sep 18 05:06:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:07:11 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:07:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2832| 47| | |
| 1|DDLOG | 2832| 47| | |
| 2|BTCSCHED | 5666| 49| | |
| 3|RESTART_ALL | 1133| 165| | |
| 4|ENVCHECK | 17001| 20| | |
| 5|AUTOABAP | 1133| 165| | |
| 6|BGRFC_WATCHDOG | 1134| 165| | |
| 7|AUTOTH | 1747| 55| | |
| 8|AUTOCCMS | 5666| 49| | |
| 9|AUTOSECURITY | 5666| 49| | |
| 10|LOAD_CALCULATION | 339539| 0| | |
| 11|SPOOLALRM | 5667| 49| | |
| 12|CALL_DELAYED | 0| 8063| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 125 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:07:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:07:31:090 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:07:51:091 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:08:11:092 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 8698

Wed Sep 18 05:08:18:646 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:08:31:092 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 8698 terminated

Wed Sep 18 05:08:51:093 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:09:11:093 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:09:31:094 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:09:51:095 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:10:11:095 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-9567
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-9568

Wed Sep 18 05:10:12:811 2019

*** ERROR => DpHdlDeadWp: W1 (pid 9567) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9567) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 9567)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 9568) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9568) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 9568)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:10:31:096 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:10:51:097 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:11:11:097 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:11:31:098 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:11:51:098 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:12:11:099 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:12:31:099 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:12:51:101 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:08:11 2019, skip new snapshot

Wed Sep 18 05:13:11:101 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 126 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:13:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:13:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:13:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10965520, readCount 10965520)
UPD : 0 (peak 31, writeCount 2376, readCount 2376)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080599, readCount 1080599)
SPO : 0 (peak 2, writeCount 12487, readCount 12487)
UP2 : 0 (peak 1, writeCount 1195, readCount 1195)
DISP: 0 (peak 67, writeCount 421933, readCount 421933)
GW : 0 (peak 45, writeCount 9997536, readCount 9997536)
ICM : 0 (peak 186, writeCount 199496, readCount 199496)
LWP : 2 (peak 15, writeCount 18883, readCount 18881)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:13:11 2019


------------------------------------------------------------
Current snapshot id: 126
DB clean time (in percent of total time) : 24.47 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|


Time |Program |Cli|User |Action
|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|---
--|----------------------------------------|---|------------|--------------------|-
-------------------|
| 1| |DIA |WP_KILL| |152|norm|T138_U21626_M0 |HTTP_NORM| | |
|CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| |
|
| 12| |BTC |WP_KILL| |152|low |T105_U21576_M0 |BATCH | | |
|SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD |
|

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 05:13:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |


Program |Prio|Tasks|Application-Info
|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|-
---------------------------------------|----|-----|--------------------------------
------------------|----------|----------|
|SYNC_RFC |T54_U9861_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |05:12:58|3 |
SAPMSSY1 |norm| |
| | 4246|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |
SMREP_PROCESS_BW_DATA_QUEUE |low | |
| | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |
SAPMHTTP |norm| |
| | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 05:13:11 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 150|43342091|43342091SU9861_M0 |T54_U9861_M0_I0 |ALLOCATED |SERVER|RECEIVE | 3|Wed Sep 18 05:12:58 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)
MPI Info Wed Sep 18 05:13:11 2019
------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:13:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2835| 47| | |
| 1|DDLOG | 2835| 47| | |
| 2|BTCSCHED | 5672| 49| | |
| 3|RESTART_ALL | 1134| 105| | |
| 4|ENVCHECK | 17019| 20| | |
| 5|AUTOABAP | 1134| 105| | |
| 6|BGRFC_WATCHDOG | 1135| 105| | |
| 7|AUTOTH | 1753| 55| | |
| 8|AUTOCCMS | 5672| 49| | |
| 9|AUTOSECURITY | 5672| 49| | |
| 10|LOAD_CALCULATION | 339898| 0| | |
| 11|SPOOLALRM | 5673| 49| | |
| 12|CALL_DELAYED | 0| 7703| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 126 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:13:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:13:31:101 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:13:51:102 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:14:11:102 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 10703

Wed Sep 18 05:14:18:975 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:14:31:102 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 10703 terminated

Wed Sep 18 05:14:51:103 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:15:11:103 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-11637
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-11638

Wed Sep 18 05:15:12:085 2019


*** ERROR => DpHdlDeadWp: W1 (pid 11637) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11637) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 11637)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 11638) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11638) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 11638)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:15:31:104 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:15:51:105 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:16:11:105 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:16:31:105 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:16:51:105 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:17:11:106 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:17:31:106 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:17:51:107 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:18:11:107 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:18:31:108 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:18:51:108 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:14:11 2019, skip new snapshot

Wed Sep 18 05:19:11:109 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 127 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:19:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:19:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:19:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10966380, readCount 10966380)


UPD : 0 (peak 31, writeCount 2377, readCount 2377)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080603, readCount 1080603)
SPO : 0 (peak 2, writeCount 12500, readCount 12500)
UP2 : 0 (peak 1, writeCount 1196, readCount 1196)
DISP: 0 (peak 67, writeCount 421974, readCount 421974)
GW : 0 (peak 45, writeCount 9998108, readCount 9998108)
ICM : 1 (peak 186, writeCount 199501, readCount 199500)
LWP : 2 (peak 15, writeCount 18898, readCount 18896)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:19:11 2019


------------------------------------------------------------

Current snapshot id: 127


DB clean time (in percent of total time) : 24.47 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |153|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |153|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 05:19:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:19:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:19:11 2019


------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:19:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2838| 47| | |
| 1|DDLOG | 2838| 47| | |
| 2|BTCSCHED | 5678| 49| | |
| 3|RESTART_ALL | 1135| 45| | |
| 4|ENVCHECK | 17037| 20| | |
| 5|AUTOABAP | 1135| 45| | |
| 6|BGRFC_WATCHDOG | 1136| 45| | |
| 7|AUTOTH | 1759| 55| | |
| 8|AUTOCCMS | 5678| 49| | |
| 9|AUTOSECURITY | 5678| 49| | |
| 10|LOAD_CALCULATION | 340256| 0| | |
| 11|SPOOLALRM | 5679| 49| | |
| 12|CALL_DELAYED | 0| 7343| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 127 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:19:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:19:31:110 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:19:51:110 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:20:11:111 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-13027
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-13028
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13029

Wed Sep 18 05:20:12:377 2019


*** ERROR => DpHdlDeadWp: W1 (pid 13027) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13027) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13027)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13028) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13028) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13028)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:20:18:351 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:20:31:111 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13029 terminated

Wed Sep 18 05:20:51:112 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:21:11:113 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:21:31:113 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:21:51:114 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:22:11:115 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:22:31:115 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:22:51:116 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:23:11:116 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:23:31:117 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:23:51:117 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:24:11:118 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:24:31:118 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:24:51:119 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:20:11 2019, skip new snapshot

Wed Sep 18 05:25:11:119 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 128 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:25:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:25:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:25:11 2019


------------------------------------------------------------
Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10967218, readCount 10967218)


UPD : 0 (peak 31, writeCount 2378, readCount 2378)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080607, readCount 1080607)
SPO : 0 (peak 2, writeCount 12513, readCount 12513)
UP2 : 0 (peak 1, writeCount 1197, readCount 1197)
DISP: 0 (peak 67, writeCount 422019, readCount 422019)
GW : 0 (peak 45, writeCount 9998661, readCount 9998661)
ICM : 0 (peak 186, writeCount 199504, readCount 199504)
LWP : 2 (peak 15, writeCount 18913, readCount 18911)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:25:11 2019


------------------------------------------------------------

Current snapshot id: 128


DB clean time (in percent of total time) : 24.48 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |154|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |154|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 05:25:11 2019


------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:25:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:25:11 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:25:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2841| 47| | |
| 1|DDLOG | 2841| 47| | |
| 2|BTCSCHED | 5684| 49| | |
| 3|RESTART_ALL | 1137| 285| | |
| 4|ENVCHECK | 17055| 20| | |
| 5|AUTOABAP | 1137| 285| | |
| 6|BGRFC_WATCHDOG | 1138| 285| | |
| 7|AUTOTH | 1765| 55| | |
| 8|AUTOCCMS | 5684| 49| | |
| 9|AUTOSECURITY | 5684| 49| | |
| 10|LOAD_CALCULATION | 340615| 0| | |
| 11|SPOOLALRM | 5685| 49| | |
| 12|CALL_DELAYED | 0| 6983| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 128 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:25:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-14826
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:25:11:126 2019


DpWpDynCreate: created new work process W12-14827

Wed Sep 18 05:25:11:847 2019


*** ERROR => DpHdlDeadWp: W1 (pid 14826) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14826) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14826)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14827) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14827) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14827)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:25:31:120 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:25:51:120 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:26:11:120 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15088

Wed Sep 18 05:26:19:141 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:26:31:121 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15088 terminated

Wed Sep 18 05:26:51:121 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:27:11:121 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:27:31:121 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:27:51:122 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:28:11:123 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:28:31:124 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:28:51:125 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:29:11:125 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:29:31:126 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:29:51:126 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:30:11:126 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W1-16535
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W12-16536

Wed Sep 18 05:30:11:903 2019
*** ERROR => DpHdlDeadWp: W1 (pid 16535) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16535) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16535)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16536) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16536) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16536)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:30:31:126 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:30:51:126 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:26:11 2019, skip new snapshot

Wed Sep 18 05:31:11:127 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 129 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:31:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:31:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:31:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10968018, readCount 10968018)
UPD : 0 (peak 31, writeCount 2379, readCount 2379)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080611, readCount 1080611)
SPO : 0 (peak 2, writeCount 12526, readCount 12526)
UP2 : 0 (peak 1, writeCount 1198, readCount 1198)
DISP: 0 (peak 67, writeCount 422060, readCount 422060)
GW : 1 (peak 45, writeCount 9999191, readCount 9999190)
ICM : 0 (peak 186, writeCount 199507, readCount 199507)
LWP : 0 (peak 15, writeCount 18928, readCount 18928)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:31:11 2019
------------------------------------------------------------
Current snapshot id: 129
DB clean time (in percent of total time) : 24.48 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |156|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |156|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 05:31:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions
Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:31:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:31:11 2019
------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:31:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2844| 47| | |
| 1|DDLOG | 2844| 47| | |
| 2|BTCSCHED | 5690| 49| | |
| 3|RESTART_ALL | 1138| 225| | |
| 4|ENVCHECK | 17073| 20| | |
| 5|AUTOABAP | 1138| 225| | |
| 6|BGRFC_WATCHDOG | 1139| 225| | |
| 7|AUTOTH | 1771| 55| | |
| 8|AUTOCCMS | 5690| 49| | |
| 9|AUTOSECURITY | 5690| 49| | |
| 10|LOAD_CALCULATION | 340974| 0| | |
| 11|SPOOLALRM | 5691| 49| | |
| 12|CALL_DELAYED | 0| 6623| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 129 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:31:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:31:31:128 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:31:51:128 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:32:11:129 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 17059

Wed Sep 18 05:32:18:754 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:32:31:130 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 17059 terminated

Wed Sep 18 05:32:51:131 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:33:11:132 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:33:31:132 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:33:51:133 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:34:11:133 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:34:31:133 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:34:51:134 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:35:11:135 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-18366
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18367

Wed Sep 18 05:35:11:957 2019
*** ERROR => DpHdlDeadWp: W1 (pid 18366) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18366) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18366)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18367) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18367) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18367)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:35:31:136 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:35:51:137 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:36:11:137 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:36:31:138 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:36:51:138 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:32:11 2019, skip new snapshot

Wed Sep 18 05:37:11:139 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 130 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:37:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:37:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:37:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10968836, readCount 10968836)
UPD : 0 (peak 31, writeCount 2380, readCount 2380)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080615, readCount 1080615)
SPO : 0 (peak 2, writeCount 12539, readCount 12539)
UP2 : 0 (peak 1, writeCount 1199, readCount 1199)
DISP: 0 (peak 67, writeCount 422101, readCount 422101)
GW : 1 (peak 45, writeCount 9999715, readCount 9999714)
ICM : 0 (peak 186, writeCount 199510, readCount 199510)
LWP : 0 (peak 15, writeCount 18943, readCount 18943)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 273 (rq_id 29436107, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:37:11 2019
------------------------------------------------------------
Current snapshot id: 130
DB clean time (in percent of total time) : 24.48 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |157|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |157|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 05:37:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions
Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:37:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:37:11 2019
------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:37:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2847| 47| | |
| 1|DDLOG | 2847| 47| | |
| 2|BTCSCHED | 5696| 49| | |
| 3|RESTART_ALL | 1139| 165| | |
| 4|ENVCHECK | 17091| 20| | |
| 5|AUTOABAP | 1139| 165| | |
| 6|BGRFC_WATCHDOG | 1140| 165| | |
| 7|AUTOTH | 1777| 55| | |
| 8|AUTOCCMS | 5696| 49| | |
| 9|AUTOSECURITY | 5696| 49| | |
| 10|LOAD_CALCULATION | 341334| 1| | |
| 11|SPOOLALRM | 5697| 49| | |
| 12|CALL_DELAYED | 0| 6263| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 130 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:37:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:37:31:140 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:37:51:140 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:38:11:140 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19175

Wed Sep 18 05:38:18:891 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:38:31:141 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19175 terminated

Wed Sep 18 05:38:51:142 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:39:11:142 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:39:31:142 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:39:51:143 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:40:11:144 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-20046
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-20047

Wed Sep 18 05:40:12:014 2019
*** ERROR => DpHdlDeadWp: W1 (pid 20046) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20046) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 20046)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 20047) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20047) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 20047)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:40:31:144 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:40:51:145 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:41:11:145 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:41:31:146 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:41:51:146 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:42:11:147 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:42:31:148 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:42:51:148 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:38:11 2019, skip new snapshot

Wed Sep 18 05:43:11:149 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 131 (Reason: Workprocess 1 died / Time: Wed Sep 18
05:43:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:43:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:43:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10969672, readCount 10969672)


UPD : 0 (peak 31, writeCount 2382, readCount 2382)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080623, readCount 1080623)
SPO : 0 (peak 2, writeCount 12553, readCount 12553)
UP2 : 0 (peak 1, writeCount 1201, readCount 1201)
DISP: 0 (peak 67, writeCount 422142, readCount 422142)
GW : 0 (peak 45, writeCount 10000257, readCount 10000257)
ICM : 0 (peak 186, writeCount 199515, readCount 199515)
LWP : 2 (peak 15, writeCount 18973, readCount 18971)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:43:11 2019


------------------------------------------------------------

Current snapshot id: 131


DB clean time (in percent of total time) : 24.49 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|


Time |Program |Cli|User |Action
|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|---
--|----------------------------------------|---|------------|--------------------|-
-------------------|
| 1| |DIA |WP_KILL| |158|norm|T138_U21626_M0 |HTTP_NORM| | |
|CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| |
|
| 12| |BTC |WP_KILL| |158|low |T105_U21576_M0 |BATCH | | |
|SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD |
|

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 05:43:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |


Program |Prio|Tasks|Application-Info
|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|-
---------------------------------------|----|-----|--------------------------------
------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |
SMREP_PROCESS_BW_DATA_QUEUE |low | |
| | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |
SAPMHTTP |norm| |
| | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:43:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:43:11 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:43:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2850| 47| | |
| 1|DDLOG | 2850| 47| | |
| 2|BTCSCHED | 5702| 49| | |
| 3|RESTART_ALL | 1140| 105| | |
| 4|ENVCHECK | 17109| 20| | |
| 5|AUTOABAP | 1140| 105| | |
| 6|BGRFC_WATCHDOG | 1141| 105| | |
| 7|AUTOTH | 1783| 55| | |
| 8|AUTOCCMS | 5702| 49| | |
| 9|AUTOSECURITY | 5702| 49| | |
| 10|LOAD_CALCULATION | 341692| 0| | |
| 11|SPOOLALRM | 5703| 49| | |
| 12|CALL_DELAYED | 0| 5903| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 131 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:43:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:43:31:149 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:43:51:149 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:44:11:150 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 21892

Wed Sep 18 05:44:18:948 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:44:31:151 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 21892 terminated

Wed Sep 18 05:44:51:152 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:45:11:153 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-22459
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-22460

Wed Sep 18 05:45:12:064 2019


*** ERROR => DpHdlDeadWp: W1 (pid 22459) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22459) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 22459)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 22460) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22460) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 22460)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:45:31:154 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:45:51:154 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:46:11:154 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:46:31:155 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:46:51:156 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:47:11:156 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:47:31:157 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:47:51:157 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:48:11:157 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:48:31:158 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:48:51:159 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:44:11 2019, skip new snapshot

Wed Sep 18 05:49:11:160 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 132 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:49:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:49:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:49:11 2019


------------------------------------------------------------
Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10970491, readCount 10970491)


UPD : 0 (peak 31, writeCount 2383, readCount 2383)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080627, readCount 1080627)
SPO : 0 (peak 2, writeCount 12566, readCount 12566)
UP2 : 0 (peak 1, writeCount 1202, readCount 1202)
DISP: 0 (peak 67, writeCount 422182, readCount 422182)
GW : 0 (peak 45, writeCount 10000797, readCount 10000797)
ICM : 0 (peak 186, writeCount 199518, readCount 199518)
LWP : 2 (peak 15, writeCount 18988, readCount 18986)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:49:11 2019


------------------------------------------------------------

Current snapshot id: 132


DB clean time (in percent of total time) : 24.49 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |159|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |159|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 05:49:11 2019


------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:49:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:49:11 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:49:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2853| 47| | |
| 1|DDLOG | 2853| 47| | |
| 2|BTCSCHED | 5708| 49| | |
| 3|RESTART_ALL | 1141| 45| | |
| 4|ENVCHECK | 17127| 20| | |
| 5|AUTOABAP | 1141| 45| | |
| 6|BGRFC_WATCHDOG | 1142| 45| | |
| 7|AUTOTH | 1789| 55| | |
| 8|AUTOCCMS | 5708| 49| | |
| 9|AUTOSECURITY | 5708| 49| | |
| 10|LOAD_CALCULATION | 342051| 0| | |
| 11|SPOOLALRM | 5709| 49| | |
| 12|CALL_DELAYED | 0| 5543| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 132 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:49:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:49:31:160 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:49:51:161 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:50:11:161 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-23840
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-23841
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23842

Wed Sep 18 05:50:12:119 2019


*** ERROR => DpHdlDeadWp: W1 (pid 23840) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23840) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 23840)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 23841) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23841) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 23841)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:50:19:432 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:50:31:162 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23842 terminated

Wed Sep 18 05:50:51:163 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:51:11:164 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:51:31:164 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:51:51:164 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:52:11:164 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:52:31:165 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:52:51:166 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:53:11:167 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:53:31:167 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:53:51:167 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:54:11:167 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:54:31:168 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:54:51:169 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:50:11 2019, skip new snapshot

Wed Sep 18 05:55:11:169 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 133 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:55:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 05:55:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 05:55:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10971288, readCount 10971288)


UPD : 0 (peak 31, writeCount 2384, readCount 2384)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080631, readCount 1080631)
SPO : 0 (peak 2, writeCount 12579, readCount 12579)
UP2 : 0 (peak 1, writeCount 1203, readCount 1203)
DISP: 0 (peak 67, writeCount 422227, readCount 422227)
GW : 0 (peak 45, writeCount 10001326, readCount 10001326)
ICM : 0 (peak 186, writeCount 199521, readCount 199521)
LWP : 2 (peak 15, writeCount 19003, readCount 19001)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 05:55:11 2019


------------------------------------------------------------

Current snapshot id: 133


DB clean time (in percent of total time) : 24.49 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |160|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |160|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 05:55:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 05:55:11 2019


CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 05:55:11 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 05:55:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2856| 47| | |
| 1|DDLOG | 2856| 47| | |
| 2|BTCSCHED | 5714| 49| | |
| 3|RESTART_ALL | 1143| 285| | |
| 4|ENVCHECK | 17145| 20| | |
| 5|AUTOABAP | 1143| 285| | |
| 6|BGRFC_WATCHDOG | 1144| 285| | |
| 7|AUTOTH | 1795| 55| | |
| 8|AUTOCCMS | 5714| 49| | |
| 9|AUTOSECURITY | 5714| 49| | |
| 10|LOAD_CALCULATION | 342411| 0| | |
| 11|SPOOLALRM | 5715| 49| | |
| 12|CALL_DELAYED | 0| 5183| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 133 (Reason: Workprocess 1 died / Time: Wed Sep 18 05:55:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-26058
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:55:11:175 2019


DpWpDynCreate: created new work process W12-26059
Wed Sep 18 05:55:12:175 2019


*** ERROR => DpHdlDeadWp: W1 (pid 26058) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26058) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 26058)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 26059) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26059) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 26059)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:55:31:169 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:55:51:170 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 05:56:11:170 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 26246

Wed Sep 18 05:56:19:247 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:56:31:171 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 26246 terminated

Wed Sep 18 05:56:51:172 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:57:11:173 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:57:31:173 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:57:51:174 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:58:11:174 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:58:31:174 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:58:51:175 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:59:11:175 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:59:31:175 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 05:59:51:176 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new
snapshot

Wed Sep 18 06:00:11:176 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W1-27703
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W12-27704

Wed Sep 18 06:00:12:224 2019

*** ERROR => DpHdlDeadWp: W1 (pid 27703) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27703) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27703)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27704) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27704) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27704)
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
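The status values in the DpHdlDeadWp errors above are raw POSIX wait() status words; status=65280 (0xFF00) is exactly how "exit code 255" is encoded. A minimal sketch of the decoding — `decode_wait_status` is a hypothetical helper, not part of the SAP kernel:

```python
# Hypothetical helper (not an SAP tool): decode the raw wait() status word
# that dev_disp prints, e.g. "died (severity=0, status=65280)", into the
# exit code that DpTraceWpStatus reports ("exited with exit code 255").
def decode_wait_status(status: int) -> tuple[int, int]:
    """Return (exit_code, signal) from a POSIX wait() status word."""
    exit_code = (status >> 8) & 0xFF  # high byte holds the exit code
    signal = status & 0x7F            # low 7 bits hold a terminating signal
    return exit_code, signal

print(decode_wait_status(65280))  # (255, 0): clean exit with code 255
```

Since 65280 == 0xFF00, the high byte is 255 and no signal bit is set, matching the "exited with exit code 255" trace line.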

Wed Sep 18 06:00:31:176 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 06:00:51:177 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 05:56:11 2019, skip new snapshot

Wed Sep 18 06:01:11:178 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 134 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:01:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:01:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:01:11 2019

------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10972296, readCount 10972296)
UPD : 0 (peak 31, writeCount 2385, readCount 2385)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080635, readCount 1080635)
SPO : 0 (peak 2, writeCount 12592, readCount 12592)
UP2 : 0 (peak 1, writeCount 1204, readCount 1204)
DISP: 0 (peak 67, writeCount 422268, readCount 422268)
GW  : 0 (peak 45, writeCount 10002022, readCount 10002022)
ICM : 0 (peak 186, writeCount 199524, readCount 199524)
LWP : 0 (peak 15, writeCount 19018, readCount 19018)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:01:11 2019

------------------------------------------------------------

Current snapshot id: 134

DB clean time (in percent of total time) : 24.49 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |162|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |162|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 06:01:11 2019

------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|
|SYNC_RFC |T140_U10766_M0 | | |smprd02.niladv.org |06:01:11| | |norm|1 | | | 0|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

RFC-Connection Table (1 entries) Wed Sep 18 06:01:11 2019

------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 25|12072178|12072178SU10766_M0 |T140_U10766_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
333 INVALID -1
1 ca_blk slots of 6000 in use, 1 currently unowned (in request queues)

MPI Info Wed Sep 18 06:01:11 2019


------------------------------------------------------------
Current pipes in use: 221
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:01:11 2019

------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2859| 47| | |
| 1|DDLOG | 2859| 47| | |
| 2|BTCSCHED | 5720| 49| | |
| 3|RESTART_ALL | 1144| 225| | |
| 4|ENVCHECK | 17163| 20| | |
| 5|AUTOABAP | 1144| 225| | |
| 6|BGRFC_WATCHDOG | 1145| 225| | |
| 7|AUTOTH | 1801| 55| | |
| 8|AUTOCCMS | 5720| 49| | |
| 9|AUTOSECURITY | 5720| 49| | |
| 10|LOAD_CALCULATION | 342770| 0| | |
| 11|SPOOLALRM | 5721| 49| | |
| 12|CALL_DELAYED | 0| 4823| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 134 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:01:11 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
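Traces like this are easiest to read in aggregate. A small sketch (not an SAP utility) that tallies which work processes keep dying, using DpHdlDeadWp ERROR lines copied from this trace as embedded sample input:

```python
import re
from collections import Counter

# Sample lines copied from this trace; in practice you would read the
# whole dev_disp file instead of this embedded string.
sample = """\
*** ERROR => DpHdlDeadWp: W1 (pid 27703) died (severity=0, status=65280) [dpxxwp.c 1463]
*** ERROR => DpHdlDeadWp: W12 (pid 27704) died (severity=0, status=65280) [dpxxwp.c 1463]
*** ERROR => DpHdlDeadWp: W1 (pid 13073) died (severity=0, status=65280) [dpxxwp.c 1463]
"""
# Extract the work-process name from each death record and count repeats.
deaths = Counter(re.findall(r"DpHdlDeadWp: (W\d+) \(pid \d+\) died", sample))
print(deaths.most_common())  # [('W1', 2), ('W12', 1)]
```

Run against the full file, a pair of work processes (here W1 and W12) dominating the counts points at a restart loop rather than isolated crashes.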

Wed Sep 18 06:01:31:179 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:01:51:179 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:02:11:180 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 458

Wed Sep 18 06:02:19:440 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:02:31:180 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 458 terminated

Wed Sep 18 06:02:51:181 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:03:11:181 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:03:31:182 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:03:51:182 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:04:11:183 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:04:31:184 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:04:51:185 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:05:11:185 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-13073
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-13074

Wed Sep 18 06:05:12:283 2019

*** ERROR => DpHdlDeadWp: W1 (pid 13073) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13073) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13073)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13074) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13074) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13074)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:05:31:185 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:05:51:186 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:06:11:186 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:06:31:186 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:06:51:187 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:02:11 2019, skip new snapshot

Wed Sep 18 06:07:11:187 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 135 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:07:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:07:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:07:11 2019

------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10973365, readCount 10973365)
UPD : 0 (peak 31, writeCount 2386, readCount 2386)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080639, readCount 1080639)
SPO : 0 (peak 2, writeCount 12605, readCount 12605)
UP2 : 0 (peak 1, writeCount 1205, readCount 1205)
DISP: 0 (peak 67, writeCount 422309, readCount 422309)
GW  : 0 (peak 45, writeCount 10002778, readCount 10002778)
ICM : 0 (peak 186, writeCount 199527, readCount 199527)
LWP : 0 (peak 15, writeCount 19033, readCount 19033)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests
Workprocess Table (long) Wed Sep 18 06:07:11 2019

------------------------------------------------------------

Current snapshot id: 135

DB clean time (in percent of total time) : 24.50 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |163|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |163|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 06:07:11 2019

------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions

Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:07:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:07:11 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:07:11 2019

------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2862| 47| | |
| 1|DDLOG | 2862| 47| | |
| 2|BTCSCHED | 5726| 49| | |
| 3|RESTART_ALL | 1145| 165| | |
| 4|ENVCHECK | 17181| 20| | |
| 5|AUTOABAP | 1145| 165| | |
| 6|BGRFC_WATCHDOG | 1146| 165| | |
| 7|AUTOTH | 1807| 55| | |
| 8|AUTOCCMS | 5726| 49| | |
| 9|AUTOSECURITY | 5726| 49| | |
| 10|LOAD_CALCULATION | 343129| 0| | |
| 11|SPOOLALRM | 5727| 49| | |
| 12|CALL_DELAYED | 0| 4463| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 135 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:07:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
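The skipped-snapshot throttling above produces a steady cadence: snapshot 134 at 06:01:11, 135 at 06:07:11, 136 at 06:13:11. A quick check of that interval from the banner timestamps, a sketch using Python's strptime with the trace's date format:

```python
from datetime import datetime

# Banner timestamps copied from snapshots 134 and 135 in this trace.
fmt = "%a %b %d %H:%M:%S %Y"
t134 = datetime.strptime("Wed Sep 18 06:01:11 2019", fmt)
t135 = datetime.strptime("Wed Sep 18 06:07:11 2019", fmt)
print((t135 - t134).total_seconds())  # 360.0 -> one snapshot every 6 minutes
```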

Wed Sep 18 06:07:31:188 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:07:51:188 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:08:11:188 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25830

Wed Sep 18 06:08:19:628 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:08:31:189 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25830 terminated

Wed Sep 18 06:08:51:189 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:09:11:190 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:09:31:190 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:09:51:191 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:10:11:192 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-26785
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-26786

Wed Sep 18 06:10:12:333 2019

*** ERROR => DpHdlDeadWp: W1 (pid 26785) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26785) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 26785)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 26786) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26786) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 26786)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:10:31:192 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:10:51:193 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:11:11:194 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:11:31:195 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:11:51:195 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:12:11:195 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:12:31:196 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:12:51:197 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:08:11 2019, skip new snapshot

Wed Sep 18 06:13:11:197 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 136 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:13:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:13:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:13:11 2019

------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10974235, readCount 10974235)
UPD : 0 (peak 31, writeCount 2388, readCount 2388)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080647, readCount 1080647)
SPO : 0 (peak 2, writeCount 12619, readCount 12619)
UP2 : 0 (peak 1, writeCount 1207, readCount 1207)
DISP: 0 (peak 67, writeCount 422350, readCount 422350)
GW  : 0 (peak 45, writeCount 10003354, readCount 10003354)
ICM : 1 (peak 186, writeCount 199534, readCount 199533)
LWP : 2 (peak 15, writeCount 19063, readCount 19061)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:13:11 2019

------------------------------------------------------------

Current snapshot id: 136

DB clean time (in percent of total time) : 24.50 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |164|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |164|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses

Total number of workprocesses is 16

Session Table Wed Sep 18 06:13:11 2019

------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions

Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:13:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:13:11 2019


------------------------------------------------------------
Current pipes in use: 225
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:13:11 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2865|        47|                    |          |
|       1|DDLOG               |      2865|        47|                    |          |
|       2|BTCSCHED            |      5732|        49|                    |          |
|       3|RESTART_ALL         |      1146|       105|                    |          |
|       4|ENVCHECK            |     17199|        20|                    |          |
|       5|AUTOABAP            |      1146|       105|                    |          |
|       6|BGRFC_WATCHDOG      |      1147|       105|                    |          |
|       7|AUTOTH              |      1813|        55|                    |          |
|       8|AUTOCCMS            |      5732|        49|                    |          |
|       9|AUTOSECURITY        |      5732|        49|                    |          |
|      10|LOAD_CALCULATION    |    343488|         0|                    |          |
|      11|SPOOLALRM           |      5733|        49|                    |          |
|      12|CALL_DELAYED        |         0|      4103|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 136 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:13:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:13:31:197 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:13:51:197 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:14:11:198 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27972

Wed Sep 18 06:14:19:725 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:14:31:199 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27972 terminated

Wed Sep 18 06:14:51:199 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:15:11:200 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-28604
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-28605

Wed Sep 18 06:15:12:389 2019


*** ERROR => DpHdlDeadWp: W1 (pid 28604) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28604) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28604)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28605) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28605) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28605)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:15:31:201 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:15:51:201 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:16:11:202 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:16:31:202 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:16:51:203 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:17:11:203 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:17:31:203 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:17:51:204 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:18:11:205 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:18:31:205 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:18:51:205 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:14:11 2019, skip new snapshot

Wed Sep 18 06:19:11:206 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 137 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:19:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:19:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:19:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10975146, readCount 10975146)


UPD : 0 (peak 31, writeCount 2389, readCount 2389)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080651, readCount 1080651)
SPO : 0 (peak 2, writeCount 12632, readCount 12632)
UP2 : 0 (peak 1, writeCount 1208, readCount 1208)
DISP: 0 (peak 67, writeCount 422391, readCount 422391)
GW : 0 (peak 45, writeCount 10003976, readCount 10003976)
ICM : 0 (peak 186, writeCount 199537, readCount 199537)
LWP : 2 (peak 15, writeCount 19078, readCount 19076)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:19:11 2019


------------------------------------------------------------

Current snapshot id: 137


DB clean time (in percent of total time) : 24.50 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |165|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |165|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:19:11 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:19:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:19:11 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:19:11 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2868|        47|                    |          |
|       1|DDLOG               |      2868|        47|                    |          |
|       2|BTCSCHED            |      5738|        49|                    |          |
|       3|RESTART_ALL         |      1147|        45|                    |          |
|       4|ENVCHECK            |     17217|        20|                    |          |
|       5|AUTOABAP            |      1147|        45|                    |          |
|       6|BGRFC_WATCHDOG      |      1148|        45|                    |          |
|       7|AUTOTH              |      1819|        55|                    |          |
|       8|AUTOCCMS            |      5738|        49|                    |          |
|       9|AUTOSECURITY        |      5738|        49|                    |          |
|      10|LOAD_CALCULATION    |    343847|         0|                    |          |
|      11|SPOOLALRM           |      5739|        49|                    |          |
|      12|CALL_DELAYED        |         0|      3743|                    |          |

Found 13 periodic tasks


********** SERVER SNAPSHOT 137 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:19:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:19:31:206 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:19:51:206 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:20:11:207 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-30007
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-30008
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 30009

Wed Sep 18 06:20:12:440 2019


*** ERROR => DpHdlDeadWp: W1 (pid 30007) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30007) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 30007)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 30008) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30008) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 30008)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:20:19:306 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:20:31:207 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 30009 terminated

Wed Sep 18 06:20:51:207 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:21:11:208 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:21:31:208 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:21:51:208 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:22:11:209 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:22:31:209 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:22:51:209 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:23:11:210 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:23:31:210 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:23:51:211 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:24:11:212 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:24:31:212 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:24:51:212 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:20:11 2019, skip new snapshot

Wed Sep 18 06:25:11:212 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 138 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:25:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:25:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:25:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10975994, readCount 10975994)


UPD : 0 (peak 31, writeCount 2390, readCount 2390)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080655, readCount 1080655)
SPO : 0 (peak 2, writeCount 12645, readCount 12645)
UP2 : 0 (peak 1, writeCount 1209, readCount 1209)
DISP: 0 (peak 67, writeCount 422436, readCount 422436)
GW : 0 (peak 45, writeCount 10004531, readCount 10004531)
ICM : 0 (peak 186, writeCount 199540, readCount 199540)
LWP : 2 (peak 15, writeCount 19093, readCount 19091)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:25:11 2019


------------------------------------------------------------

Current snapshot id: 138


DB clean time (in percent of total time) : 24.51 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |166|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |166|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:25:11 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:25:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:25:11 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:25:11 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2871|        47|                    |          |
|       1|DDLOG               |      2871|        47|                    |          |
|       2|BTCSCHED            |      5744|        49|                    |          |
|       3|RESTART_ALL         |      1149|       285|                    |          |
|       4|ENVCHECK            |     17235|        20|                    |          |
|       5|AUTOABAP            |      1149|       285|                    |          |
|       6|BGRFC_WATCHDOG      |      1150|       285|                    |          |
|       7|AUTOTH              |      1825|        55|                    |          |
|       8|AUTOCCMS            |      5744|        49|                    |          |
|       9|AUTOSECURITY        |      5744|        49|                    |          |
|      10|LOAD_CALCULATION    |    344205|         0|                    |          |
|      11|SPOOLALRM           |      5745|        49|                    |          |
|      12|CALL_DELAYED        |         0|      3383|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 138 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:25:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-31754
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:25:11:218 2019


DpWpDynCreate: created new work process W12-31755

Wed Sep 18 06:25:12:496 2019


*** ERROR => DpHdlDeadWp: W1 (pid 31754) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31754) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31754)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 31755) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31755) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 31755)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:25:31:212 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:25:51:213 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:26:11:214 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 32018

Wed Sep 18 06:26:19:735 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot

Wed Sep 18 06:26:31:215 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 32018 terminated

Wed Sep 18 06:26:51:215 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot

Wed Sep 18 06:27:11:216 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
Wed Sep 18 06:27:31:216 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot

Wed Sep 18 06:27:51:217 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new
snapshot

Wed Sep 18 06:28:11:217 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:28:31:218 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:28:51:219 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:29:11:219 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:29:31:219 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:29:51:220 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:30:11:220 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-1258
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-1259

Wed Sep 18 06:30:12:546 2019


*** ERROR => DpHdlDeadWp: W1 (pid 1258) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1258) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1258)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1259) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1259) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1259)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:30:31:222 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:30:51:222 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:26:11 2019, skip new snapshot

Wed Sep 18 06:31:11:222 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 139 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:31:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:31:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:31:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10976798, readCount 10976798)


UPD : 0 (peak 31, writeCount 2391, readCount 2391)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080659, readCount 1080659)
SPO : 0 (peak 2, writeCount 12658, readCount 12658)
UP2 : 0 (peak 1, writeCount 1210, readCount 1210)
DISP: 0 (peak 67, writeCount 422476, readCount 422476)
GW : 0 (peak 45, writeCount 10005061, readCount 10005061)
ICM : 1 (peak 186, writeCount 199543, readCount 199542)
LWP : 0 (peak 15, writeCount 19108, readCount 19108)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:31:11 2019


------------------------------------------------------------

Current snapshot id: 139


DB clean time (in percent of total time) : 24.51 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |168|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |168|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:31:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:31:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:31:11 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:31:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2874| 47| | |
| 1|DDLOG | 2874| 47| | |
| 2|BTCSCHED | 5750| 49| | |
| 3|RESTART_ALL | 1150| 225| | |
| 4|ENVCHECK | 17253| 20| | |
| 5|AUTOABAP | 1150| 225| | |
| 6|BGRFC_WATCHDOG | 1151| 225| | |
| 7|AUTOTH | 1831| 55| | |
| 8|AUTOCCMS | 5750| 49| | |
| 9|AUTOSECURITY | 5750| 49| | |
| 10|LOAD_CALCULATION | 344564| 0| | |
| 11|SPOOLALRM | 5751| 49| | |
| 12|CALL_DELAYED | 0| 3023| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 139 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:31:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:31:31:222 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:31:51:223 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:32:11:224 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1903

Wed Sep 18 06:32:19:738 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:32:31:225 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1903 terminated

Wed Sep 18 06:32:51:225 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:33:11:226 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:33:31:226 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:33:51:227 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:34:11:227 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:34:31:228 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:34:51:229 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:35:11:229 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3251
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3252

Wed Sep 18 06:35:12:601 2019


*** ERROR => DpHdlDeadWp: W1 (pid 3251) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3251) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3251)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3252) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3252) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 3252)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:35:31:230 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:35:51:230 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:36:11:231 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:36:31:231 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:36:51:231 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:32:11 2019, skip new snapshot

Wed Sep 18 06:37:11:232 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 140 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:37:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:37:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:37:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10977637, readCount 10977637)


UPD : 0 (peak 31, writeCount 2392, readCount 2392)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080663, readCount 1080663)
SPO : 0 (peak 2, writeCount 12671, readCount 12671)
UP2 : 0 (peak 1, writeCount 1211, readCount 1211)
DISP: 0 (peak 67, writeCount 422517, readCount 422517)
GW : 0 (peak 45, writeCount 10005587, readCount 10005587)
ICM : 1 (peak 186, writeCount 199546, readCount 199545)
LWP : 0 (peak 15, writeCount 19123, readCount 19123)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:37:11 2019


------------------------------------------------------------

Current snapshot id: 140


DB clean time (in percent of total time) : 24.51 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |169|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |169|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:37:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:37:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:37:11 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:37:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2877| 47| | |
| 1|DDLOG | 2877| 47| | |
| 2|BTCSCHED | 5756| 49| | |
| 3|RESTART_ALL | 1151| 165| | |
| 4|ENVCHECK | 17271| 20| | |
| 5|AUTOABAP | 1151| 165| | |
| 6|BGRFC_WATCHDOG | 1152| 165| | |
| 7|AUTOTH | 1837| 55| | |
| 8|AUTOCCMS | 5756| 49| | |
| 9|AUTOSECURITY | 5756| 49| | |
| 10|LOAD_CALCULATION | 344924| 0| | |
| 11|SPOOLALRM | 5757| 49| | |
| 12|CALL_DELAYED | 0| 2663| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 140 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:37:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:37:31:232 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:37:51:232 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:38:11:233 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 4106

Wed Sep 18 06:38:19:624 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:38:31:234 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 4106 terminated

Wed Sep 18 06:38:51:234 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:39:11:235 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:39:31:236 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:39:51:236 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:40:11:237 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-5191
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-5192

Wed Sep 18 06:40:12:652 2019


*** ERROR => DpHdlDeadWp: W1 (pid 5191) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5191) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5191)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5192) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5192) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5192)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:40:31:237 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:40:51:239 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:41:11:239 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:41:31:239 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:41:51:240 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:42:11:241 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:42:31:241 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:42:51:243 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:38:11 2019, skip new snapshot

Wed Sep 18 06:43:11:243 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 141 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:43:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:43:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:43:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10978460, readCount 10978460)


UPD : 0 (peak 31, writeCount 2394, readCount 2394)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080671, readCount 1080671)
SPO : 0 (peak 2, writeCount 12685, readCount 12685)
UP2 : 0 (peak 1, writeCount 1213, readCount 1213)
DISP: 0 (peak 67, writeCount 422558, readCount 422558)
GW : 0 (peak 45, writeCount 10006115, readCount 10006115)
ICM : 0 (peak 186, writeCount 199551, readCount 199551)
LWP : 2 (peak 15, writeCount 19153, readCount 19151)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:43:11 2019


------------------------------------------------------------

Current snapshot id: 141


DB clean time (in percent of total time) : 24.51 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |170|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |170|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:43:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:43:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:43:11 2019


------------------------------------------------------------
Current pipes in use: 223
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:43:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2880| 47| | |
| 1|DDLOG | 2880| 47| | |
| 2|BTCSCHED | 5762| 49| | |
| 3|RESTART_ALL | 1152| 105| | |
| 4|ENVCHECK | 17289| 20| | |
| 5|AUTOABAP | 1152| 105| | |
| 6|BGRFC_WATCHDOG | 1153| 105| | |
| 7|AUTOTH | 1843| 55| | |
| 8|AUTOCCMS | 5762| 49| | |
| 9|AUTOSECURITY | 5762| 49| | |
| 10|LOAD_CALCULATION | 345283| 0| | |
| 11|SPOOLALRM | 5763| 49| | |
| 12|CALL_DELAYED | 0| 2303| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 141 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:43:11 2019) - end **********
***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:43:31:243 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:43:51:244 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:44:11:244 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 6339

Wed Sep 18 06:44:19:942 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:44:31:245 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 6339 terminated

Wed Sep 18 06:44:51:245 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:45:11:246 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-7050
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-7051

Wed Sep 18 06:45:12:694 2019

*** ERROR => DpHdlDeadWp: W1 (pid 7050) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7050) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7050)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7051) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7051) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7051)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
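The create/die cycle recorded above (DpWpDynCreate recreates W1 and W12, and DpHdlDeadWp reports both dead a second later) can be quantified by pairing the two message types. A minimal sketch; the trace excerpt is a hypothetical fragment in the same format as the lines above, not the full file:

```python
import re

# Hypothetical excerpt in the format of the dev_disp lines above.
trace = """\
DpWpDynCreate: created new work process W1-7050
*** ERROR => DpHdlDeadWp: W1 (pid 7050) died (severity=0, status=65280) [dpxxwp.c 1463]
DpWpDynCreate: created new work process W12-7051
*** ERROR => DpHdlDeadWp: W12 (pid 7051) died (severity=0, status=65280) [dpxxwp.c 1463]
"""

# Pair each dynamically created work process with its death record.
created = re.findall(r"DpWpDynCreate: created new work process (W\d+)-(\d+)", trace)
died = re.findall(r"DpHdlDeadWp: (W\d+) \(pid (\d+)\) died", trace)

print(created)                      # [('W1', '7050'), ('W12', '7051')]
print(set(created) == set(died))    # True -> every recreated WP died again
```

When the two sets match over a window, as they do throughout this trace, the dispatcher is in a restart loop rather than recovering.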

Wed Sep 18 06:45:31:247 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:45:51:247 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:46:11:247 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:46:31:248 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:46:51:249 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:47:11:250 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:47:31:250 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:47:51:250 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:48:11:250 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:48:31:251 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:48:51:252 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:44:11 2019, skip new snapshot

Wed Sep 18 06:49:11:253 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 142 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:49:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:49:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:49:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10979283, readCount 10979283)
UPD : 0 (peak 31, writeCount 2395, readCount 2395)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080675, readCount 1080675)
SPO : 0 (peak 2, writeCount 12698, readCount 12698)
UP2 : 0 (peak 1, writeCount 1214, readCount 1214)
DISP: 0 (peak 67, writeCount 422599, readCount 422599)
GW : 1 (peak 45, writeCount 10006649, readCount 10006648)
ICM : 1 (peak 186, writeCount 199554, readCount 199553)
LWP : 2 (peak 15, writeCount 19168, readCount 19166)
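Each queue line above follows the fixed pattern `NAME : depth (peak N, writeCount N, readCount N)`, and the current depth equals writeCount minus readCount (e.g. GW: 10006649 - 10006648 = 1). A small parsing sketch under that assumption; the two-line excerpt is illustrative:

```python
import re

# Illustrative excerpt in the format of the queue statistics above.
stats = """\
DIA : 0 (peak 291, writeCount 10979283, readCount 10979283)
GW : 1 (peak 45, writeCount 10006649, readCount 10006648)
"""

pat = re.compile(r"(\w+)\s*:\s*(\d+) \(peak (\d+), writeCount (\d+), readCount (\d+)\)")
queues = {m[0]: {"depth": int(m[1]), "peak": int(m[2]),
                 "written": int(m[3]), "read": int(m[4])}
          for m in pat.findall(stats)}

# Consistency check: depth should equal writeCount - readCount.
for name, q in queues.items():
    assert q["depth"] == q["written"] - q["read"]
print(queues["GW"]["depth"])  # 1
```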

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:49:11 2019


------------------------------------------------------------

Current snapshot id: 142


DB clean time (in percent of total time) : 24.52 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |171|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |171|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:49:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 2 logons with 2 sessions


Total ES (gross) memory of all sessions: 16 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:49:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)
MPI Info Wed Sep 18 06:49:11 2019
------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:49:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2883| 47| | |
| 1|DDLOG | 2883| 47| | |
| 2|BTCSCHED | 5768| 49| | |
| 3|RESTART_ALL | 1153| 45| | |
| 4|ENVCHECK | 17307| 20| | |
| 5|AUTOABAP | 1153| 45| | |
| 6|BGRFC_WATCHDOG | 1154| 45| | |
| 7|AUTOTH | 1849| 55| | |
| 8|AUTOCCMS | 5768| 49| | |
| 9|AUTOSECURITY | 5768| 49| | |
| 10|LOAD_CALCULATION | 345642| 0| | |
| 11|SPOOLALRM | 5769| 49| | |
| 12|CALL_DELAYED | 0| 1943| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 142 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:49:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:49:31:253 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:49:51:253 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:50:11:253 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-8624
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-8625
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 8626

Wed Sep 18 06:50:12:754 2019


*** ERROR => DpHdlDeadWp: W1 (pid 8624) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8624) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8624)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8625) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8625) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8625)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
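The status=65280 reported by DpHdlDeadWp is the raw wait() status word: its high byte carries the child's exit code, which is why DpTraceWpStatus reports exit code 255 on the next line. A quick check using standard POSIX wait-status decoding (not an SAP API):

```python
import os

# status=65280 (0xFF00) as reported in the ERROR lines above.
status = 65280

# Normal termination: the low byte (signal part) is zero.
assert os.WIFEXITED(status)

# The exit code lives in the high byte.
print(os.WEXITSTATUS(status))   # 255
print((status >> 8) & 0xFF)     # 255, same computation by hand
```

Exit code 255 is a generic failure value, so the trace alone does not say why the work processes die; the corresponding dev_w1 / dev_w12 traces would carry the actual error.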

Wed Sep 18 06:50:19:425 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:50:31:254 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 8626 terminated

Wed Sep 18 06:50:51:255 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:51:11:255 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:51:31:256 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:51:51:256 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:52:11:257 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:52:31:257 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:52:51:257 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:53:11:258 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:53:31:259 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:53:51:259 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:54:11:259 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:54:31:260 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:54:51:260 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:50:11 2019, skip new snapshot

Wed Sep 18 06:55:11:261 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 143 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:55:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 06:55:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 06:55:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10980104, readCount 10980104)
UPD : 0 (peak 31, writeCount 2396, readCount 2396)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080679, readCount 1080679)
SPO : 0 (peak 2, writeCount 12711, readCount 12711)
UP2 : 0 (peak 1, writeCount 1215, readCount 1215)
DISP: 0 (peak 67, writeCount 422654, readCount 422654)
GW : 0 (peak 45, writeCount 10007184, readCount 10007184)
ICM : 0 (peak 186, writeCount 199557, readCount 199557)
LWP : 2 (peak 15, writeCount 19183, readCount 19181)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 06:55:11 2019


------------------------------------------------------------

Current snapshot id: 143


DB clean time (in percent of total time) : 24.52 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |172|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |172|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 06:55:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 06:55:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 06:55:11 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 06:55:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2886| 47| | |
| 1|DDLOG | 2886| 47| | |
| 2|BTCSCHED | 5774| 49| | |
| 3|RESTART_ALL | 1155| 285| | |
| 4|ENVCHECK | 17325| 20| | |
| 5|AUTOABAP | 1155| 285| | |
| 6|BGRFC_WATCHDOG | 1156| 285| | |
| 7|AUTOTH | 1855| 55| | |
| 8|AUTOCCMS | 5774| 49| | |
| 9|AUTOSECURITY | 5774| 49| | |
| 10|LOAD_CALCULATION | 346001| 0| | |
| 11|SPOOLALRM | 5775| 49| | |
| 12|CALL_DELAYED | 0| 1583| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 143 (Reason: Workprocess 1 died / Time: Wed Sep 18 06:55:11 2019) - end **********
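The snapshot markers in this trace carry the snapshot id, the trigger reason, and a timestamp in a fixed layout. A sketch of extracting those fields; the regex is an assumption derived only from the marker format seen here:

```python
import re

# Marker text in the layout used throughout this trace.
header = ("********** SERVER SNAPSHOT 143 (Reason: Workprocess 1 died / "
          "Time: Wed Sep 18 06:55:11 2019) - end **********")

m = re.search(r"SERVER SNAPSHOT (\d+) \(Reason: (.+?) / Time: (.+?)\) - (begin|end)",
              header)
snap_id, reason, ts, phase = m.groups()

print(snap_id, phase)   # 143 end
print(reason)           # Workprocess 1 died
```

Collecting these tuples over the whole file would show snapshots 141-144 all firing for the same reason, "Workprocess 1 died", at roughly six-minute intervals.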

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-10565
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:55:11:266 2019


DpWpDynCreate: created new work process W12-10566

Wed Sep 18 06:55:12:808 2019


*** ERROR => DpHdlDeadWp: W1 (pid 10565) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10565) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10565)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10566) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10566) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10566)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:55:31:261 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:55:51:262 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 06:56:11:262 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 10824

Wed Sep 18 06:56:19:088 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:56:31:263 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 10824 terminated

Wed Sep 18 06:56:51:263 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:57:11:264 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:57:31:265 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:57:51:266 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:58:11:266 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:58:31:267 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:58:51:267 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:59:11:267 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:59:31:268 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 06:59:51:269 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 07:00:11:270 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W1-12388
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W12-12389
Wed Sep 18 07:00:12:860 2019
*** ERROR => DpHdlDeadWp: W1 (pid 12388) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12388) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 12388)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 12389) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12389) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 12389)
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 07:00:31:271 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 07:00:51:272 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 06:56:11 2019, skip new snapshot

Wed Sep 18 07:01:11:272 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 144 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:01:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:01:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:01:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10980933, readCount 10980933)


UPD : 0 (peak 31, writeCount 2397, readCount 2397)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080683, readCount 1080683)
SPO : 0 (peak 2, writeCount 12724, readCount 12724)
UP2 : 0 (peak 1, writeCount 1216, readCount 1216)
DISP: 0 (peak 67, writeCount 422695, readCount 422695)
GW : 0 (peak 45, writeCount 10007718, readCount 10007718)
ICM : 0 (peak 186, writeCount 199560, readCount 199560)
LWP : 0 (peak 15, writeCount 19198, readCount 19198)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:01:11 2019


------------------------------------------------------------

Current snapshot id: 144


DB clean time (in percent of total time) : 24.52 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |174|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |174|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:01:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:01:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:01:11 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:01:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2889| 47| | |
| 1|DDLOG | 2889| 47| | |
| 2|BTCSCHED | 5780| 49| | |
| 3|RESTART_ALL | 1156| 225| | |
| 4|ENVCHECK | 17343| 20| | |
| 5|AUTOABAP | 1156| 225| | |
| 6|BGRFC_WATCHDOG | 1157| 225| | |
| 7|AUTOTH | 1861| 55| | |
| 8|AUTOCCMS | 5780| 49| | |
| 9|AUTOSECURITY | 5780| 49| | |
| 10|LOAD_CALCULATION | 346360| 0| | |
| 11|SPOOLALRM | 5781| 49| | |
| 12|CALL_DELAYED | 0| 1223| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 144 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:01:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:01:31:273 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:01:51:273 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:02:11:274 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15490

Wed Sep 18 07:02:18:843 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
Wed Sep 18 07:02:31:275 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15490 terminated

Wed Sep 18 07:02:51:275 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:03:11:275 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:03:31:276 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:03:51:277 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:04:11:277 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:04:31:278 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:04:51:278 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:05:11:279 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-28075
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-28076

Wed Sep 18 07:05:12:917 2019


*** ERROR => DpHdlDeadWp: W1 (pid 28075) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28075) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28075)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28076) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28076) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28076)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:05:31:279 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:05:51:279 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:06:11:280 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:06:31:281 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:06:51:281 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:02:11 2019, skip new snapshot

Wed Sep 18 07:07:11:282 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 145 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:07:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:07:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:07:11 2019


------------------------------------------------------------
Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10981973, readCount 10981973)


UPD : 0 (peak 31, writeCount 2398, readCount 2398)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080687, readCount 1080687)
SPO : 0 (peak 2, writeCount 12737, readCount 12737)
UP2 : 0 (peak 1, writeCount 1217, readCount 1217)
DISP: 0 (peak 67, writeCount 422738, readCount 422738)
GW : 0 (peak 45, writeCount 10008448, readCount 10008448)
ICM : 0 (peak 186, writeCount 199563, readCount 199563)
LWP : 0 (peak 15, writeCount 19213, readCount 19213)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:07:11 2019


------------------------------------------------------------

Current snapshot id: 145


DB clean time (in percent of total time) : 24.52 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |175|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |175|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:07:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T101_U15657_M0 |000| |SST-LAP-HP0055 |07:06:28|2 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:07:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:07:11 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:07:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2892| 47| | |
| 1|DDLOG | 2892| 47| | |
| 2|BTCSCHED | 5786| 49| | |
| 3|RESTART_ALL | 1157| 165| | |
| 4|ENVCHECK | 17361| 20| | |
| 5|AUTOABAP | 1157| 165| | |
| 6|BGRFC_WATCHDOG | 1158| 165| | |
| 7|AUTOTH | 1867| 55| | |
| 8|AUTOCCMS | 5786| 49| | |
| 9|AUTOSECURITY | 5786| 49| | |
| 10|LOAD_CALCULATION | 346719| 1| | |
| 11|SPOOLALRM | 5787| 49| | |
| 12|CALL_DELAYED | 0| 863| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 145 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:07:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:07:31:283 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:07:51:283 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:08:11:283 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 8877

Wed Sep 18 07:08:19:464 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:08:31:284 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 8877 terminated

Wed Sep 18 07:08:51:284 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:09:11:285 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:09:31:285 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:09:51:286 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:10:11:286 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-11450
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-11451

Wed Sep 18 07:10:12:974 2019


*** ERROR => DpHdlDeadWp: W1 (pid 11450) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11450) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 11450)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 11451) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=11451) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 11451)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:10:31:287 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:10:51:288 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:11:11:288 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:11:31:289 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:11:51:290 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:12:11:290 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:12:31:291 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:12:51:291 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:08:11 2019, skip new snapshot

Wed Sep 18 07:13:11:291 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 146 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:13:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:13:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:13:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10982858, readCount 10982858)


UPD : 0 (peak 31, writeCount 2400, readCount 2400)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080695, readCount 1080695)
SPO : 0 (peak 2, writeCount 12751, readCount 12751)
UP2 : 0 (peak 1, writeCount 1219, readCount 1219)
DISP: 0 (peak 67, writeCount 422781, readCount 422781)
GW : 1 (peak 45, writeCount 10009024, readCount 10009023)
ICM : 0 (peak 186, writeCount 199570, readCount 199570)
LWP : 2 (peak 15, writeCount 19243, readCount 19241)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Dump of queue <GatewayQueue> in slot 1 (1 requests, in use, port=18689):
-1 <- 273 (rq_id 29475360, NOWP, REQ_HANDLER_CREATE_SNAPSHOT) -> -1
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:13:11 2019


------------------------------------------------------------

Current snapshot id: 146


DB clean time (in percent of total time) : 24.53 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |176|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |176|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:13:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T101_U15657_M0 |000| |SST-LAP-HP0055 |07:10:55|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:13:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:13:11 2019


------------------------------------------------------------
Current pipes in use: 223
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:13:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2895| 47| | |
| 1|DDLOG | 2895| 47| | |
| 2|BTCSCHED | 5792| 49| | |
| 3|RESTART_ALL | 1158| 105| | |
| 4|ENVCHECK | 17379| 20| | |
| 5|AUTOABAP | 1158| 105| | |
| 6|BGRFC_WATCHDOG | 1159| 105| | |
| 7|AUTOTH | 1873| 55| | |
| 8|AUTOCCMS | 5792| 49| | |
| 9|AUTOSECURITY | 5792| 49| | |
| 10|LOAD_CALCULATION | 347078| 1| | |
| 11|SPOOLALRM | 5793| 49| | |
| 12|CALL_DELAYED | 0| 503| | |

Found 13 periodic tasks


********** SERVER SNAPSHOT 146 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:13:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:13:31:292 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:13:51:293 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:14:11:293 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 12440

Wed Sep 18 07:14:18:963 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:14:31:294 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 12440 terminated

Wed Sep 18 07:14:51:294 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:15:11:294 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-13072
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-13073

Wed Sep 18 07:15:12:715 2019

*** ERROR => DpHdlDeadWp: W1 (pid 13072) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13072) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13072)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13073) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13073) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13073)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:15:31:295 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:15:51:296 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:16:11:297 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:16:31:297 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:16:51:297 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:17:11:298 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:17:31:299 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:17:51:299 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:18:11:300 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:18:31:300 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:18:51:301 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:14:11 2019, skip new snapshot

Wed Sep 18 07:19:11:301 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 147 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:19:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:19:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:19:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10983718, readCount 10983718)
UPD : 0 (peak 31, writeCount 2401, readCount 2401)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080699, readCount 1080699)
SPO : 0 (peak 2, writeCount 12764, readCount 12764)
UP2 : 0 (peak 1, writeCount 1220, readCount 1220)
DISP: 0 (peak 67, writeCount 422822, readCount 422822)
GW  : 0 (peak 45, writeCount 10009594, readCount 10009594)
ICM : 0 (peak 186, writeCount 199573, readCount 199573)
LWP : 2 (peak 15, writeCount 19258, readCount 19256)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:19:11 2019


------------------------------------------------------------

Current snapshot id: 147


DB clean time (in percent of total time) : 24.53 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |177|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |177|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:19:11 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info    |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------|----------|----------|
|GUI         |T54_U14445_M0   |000|            |SST-LAP-LEN0028     |06:54:11|5  |SAPMSYST                                |high|     |                    |SESSION_MA|      4233|
|GUI         |T101_U15657_M0  |000|            |SST-LAP-HP0055      |07:10:55|0  |SAPMSYST                                |high|     |                    |SESSION_MA|      4233|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                    |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                    |          |      4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:19:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:19:11 2019


------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:19:11 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2898|        47|                    |          |
|       1|DDLOG               |      2898|        47|                    |          |
|       2|BTCSCHED            |      5798|        49|                    |          |
|       3|RESTART_ALL         |      1159|        45|                    |          |
|       4|ENVCHECK            |     17397|        20|                    |          |
|       5|AUTOABAP            |      1159|        45|                    |          |
|       6|BGRFC_WATCHDOG      |      1160|        45|                    |          |
|       7|AUTOTH              |      1879|        55|                    |          |
|       8|AUTOCCMS            |      5798|        49|                    |          |
|       9|AUTOSECURITY        |      5798|        49|                    |          |
|      10|LOAD_CALCULATION    |    347436|         0|                    |          |
|      11|SPOOLALRM           |      5799|        49|                    |          |
|      12|CALL_DELAYED        |         0|       143|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 147 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:19:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:19:31:302 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Wed Sep 18 07:19:51:302 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:20:11:303 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-14555
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-14556
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 14557

Wed Sep 18 07:20:13:022 2019


*** ERROR => DpHdlDeadWp: W1 (pid 14555) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14555) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14555)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14556) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14556) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14556)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:20:18:912 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:20:31:304 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 14557 terminated

Wed Sep 18 07:20:51:304 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:21:11:305 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:21:31:306 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:21:51:306 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:22:11:307 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:22:31:308 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:22:51:308 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:23:11:308 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:23:31:308 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:23:51:309 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:24:11:309 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:24:31:310 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:24:51:310 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:20:11 2019, skip new snapshot

Wed Sep 18 07:25:11:311 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 148 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:25:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:25:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:25:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10984544, readCount 10984544)
UPD : 0 (peak 31, writeCount 2402, readCount 2402)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080703, readCount 1080703)
SPO : 0 (peak 2, writeCount 12777, readCount 12777)
UP2 : 0 (peak 1, writeCount 1221, readCount 1221)
DISP: 0 (peak 67, writeCount 422868, readCount 422868)
GW  : 0 (peak 45, writeCount 10010137, readCount 10010137)
ICM : 0 (peak 186, writeCount 199576, readCount 199576)
LWP : 2 (peak 15, writeCount 19273, readCount 19271)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:25:11 2019


------------------------------------------------------------

Current snapshot id: 148


DB clean time (in percent of total time) : 24.53 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |178|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |178|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:25:11 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info    |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------|----------|----------|
|GUI         |T54_U14445_M0   |000|            |SST-LAP-LEN0028     |06:54:11|5  |SAPMSYST                                |high|     |                    |SESSION_MA|      4233|
|GUI         |T101_U15657_M0  |000|            |SST-LAP-HP0055      |07:10:55|0  |SAPMSYST                                |high|     |                    |SESSION_MA|      4233|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                    |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                    |          |      4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:25:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:25:11 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:25:11 2019


------------------------------------------------------------
|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2901|        47|                    |          |
|       1|DDLOG               |      2901|        47|                    |          |
|       2|BTCSCHED            |      5804|        49|                    |          |
|       3|RESTART_ALL         |      1161|       285|                    |          |
|       4|ENVCHECK            |     17415|        20|                    |          |
|       5|AUTOABAP            |      1161|       285|                    |          |
|       6|BGRFC_WATCHDOG      |      1162|       285|                    |          |
|       7|AUTOTH              |      1885|        55|                    |          |
|       8|AUTOCCMS            |      5804|        49|                    |          |
|       9|AUTOSECURITY        |      5804|        49|                    |          |
|      10|LOAD_CALCULATION    |    347795|         1|                    |          |
|      11|SPOOLALRM           |      5805|        49|                    |          |

Found 12 periodic tasks

********** SERVER SNAPSHOT 148 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:25:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-16143
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:25:11:317 2019


DpWpDynCreate: created new work process W12-16144

Wed Sep 18 07:25:12:871 2019


*** ERROR => DpHdlDeadWp: W1 (pid 16143) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16143) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 16143)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 16144) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=16144) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 16144)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:25:31:312 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:25:51:313 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:26:11:313 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 16617

Wed Sep 18 07:26:19:007 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:26:31:314 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 16617 terminated

Wed Sep 18 07:26:51:315 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot

Wed Sep 18 07:27:11:316 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot

Wed Sep 18 07:27:31:316 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot

Wed Sep 18 07:27:51:317 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot

Wed Sep 18 07:28:11:317 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new
snapshot

Wed Sep 18 07:28:31:318 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:28:51:319 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:29:11:319 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:29:31:319 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:29:51:320 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:30:11:320 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-18061
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18062

Wed Sep 18 07:30:12:903 2019

*** ERROR => DpHdlDeadWp: W1 (pid 18061) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18061) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 18061)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 18062) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18062) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18062)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
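Editor's note on reading the error pair above: the status=65280 in the DpHdlDeadWp lines is a raw wait status, and the "exit code 255" that DpTraceWpStatus reports is simply its high byte (65280 = 255 << 8). A minimal sketch of that decoding — the regex and helper name are ours for illustration, not SAP kernel code:

```python
import re

# Illustrative parser for the DpHdlDeadWp error lines in this trace;
# the pattern and function are assumptions, not part of the SAP kernel.
DEAD_WP = re.compile(r"DpHdlDeadWp: (W\d+) \(pid (\d+)\) died \(severity=(\d+), status=(\d+)\)")

def decode(line):
    m = DEAD_WP.search(line)
    wp, pid, status = m.group(1), int(m.group(2)), int(m.group(4))
    exit_code = status >> 8  # high byte of the wait status = the child's exit code
    return wp, pid, exit_code

line = "*** ERROR => DpHdlDeadWp: W1 (pid 18061) died (severity=0, status=65280) [dpxxwp.c 1463]"
print(decode(line))  # ('W1', 18061, 255)
```

Exit code 255 on every dynamic restart (W1-18061, W12-18062, and the later attempts) indicates the new work processes abort immediately during startup rather than being killed externally.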

Wed Sep 18 07:30:31:322 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:30:51:322 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:26:11 2019, skip new snapshot

Wed Sep 18 07:31:11:323 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 149 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:31:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:31:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:31:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10985375, readCount 10985375)
UPD : 0 (peak 31, writeCount 2403, readCount 2403)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080707, readCount 1080707)
SPO : 0 (peak 2, writeCount 12790, readCount 12790)
UP2 : 0 (peak 1, writeCount 1222, readCount 1222)
DISP: 0 (peak 67, writeCount 422909, readCount 422909)
GW : 0 (peak 45, writeCount 10010671, readCount 10010671)
ICM : 0 (peak 186, writeCount 199579, readCount 199579)
LWP : 0 (peak 15, writeCount 19288, readCount 19288)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:31:11 2019
------------------------------------------------------------
Current snapshot id: 149
DB clean time (in percent of total time) : 24.53 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |180|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |180|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 07:31:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T101_U15657_M0 |000| |SST-LAP-HP0055 |07:10:55|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions
Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB
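Editor's note: the two ES totals above follow from the per-session ES Mem(KB) column under integer-MB truncation. A quick check, with the session keys and KB values copied from the Session Table of this snapshot (the dict itself is our scaffolding, not trace output):

```python
# ES Mem(KB) per session, copied from the Session Table of snapshot 149.
es_kb = {
    "T54_U14445_M0": 4233,
    "T101_U15657_M0": 4233,
    "T105_U21576_M0": 12439,
    "T138_U21626_M0": 4590,
}
total_mb = sum(es_kb.values()) // 1024   # 25495 KB -> 24 MB (truncated)
top = max(es_kb, key=es_kb.get)          # the batch session holds the most ES
print(total_mb, top, es_kb[top] // 1024) # 24 T105_U21576_M0 12
```

This matches the reported "24 MB total / 12 MB for T105_U21576_M0" lines, so the summary is a simple truncated sum, not a separate measurement.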

Communication Table is empty Wed Sep 18 07:31:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:31:11 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:31:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2904| 47| | |
| 1|DDLOG | 2904| 47| | |
| 2|BTCSCHED | 5810| 49| | |
| 3|RESTART_ALL | 1162| 225| | |
| 4|ENVCHECK | 17433| 20| | |
| 5|AUTOABAP | 1162| 225| | |
| 6|BGRFC_WATCHDOG | 1163| 225| | |
| 7|AUTOTH | 1891| 55| | |
| 8|AUTOCCMS | 5810| 49| | |
| 9|AUTOSECURITY | 5810| 49| | |
| 10|LOAD_CALCULATION | 348153| 0| | |
| 11|SPOOLALRM | 5811| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 149 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:31:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
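Editor's note: the DpSkipSnapshot lines throughout this trace are a simple rate limit — a new snapshot is only written once the previous one is old enough. A sketch of that throttle; the 5-minute interval is inferred from this trace (skips reference 07:26:11 up to 07:30:51, then snapshot 149 fires at exactly 07:31:11), not taken from kernel sources:

```python
from datetime import datetime, timedelta

# Assumption: interval inferred from the 07:26:11 -> 07:31:11 -> (07:32:11) -> 07:37:11
# cadence in this trace; it is not a documented kernel constant.
SNAPSHOT_MIN_INTERVAL = timedelta(minutes=5)

def should_skip(last_snapshot: datetime, now: datetime) -> bool:
    """Mimic DpSkipSnapshot: suppress a new snapshot while the last one is recent."""
    return now - last_snapshot < SNAPSHOT_MIN_INTERVAL

last = datetime(2019, 9, 18, 7, 26, 11)
print(should_skip(last, datetime(2019, 9, 18, 7, 30, 51)))  # True  -> "skip new snapshot"
print(should_skip(last, datetime(2019, 9, 18, 7, 31, 11)))  # False -> SERVER SNAPSHOT 149
```

The same arithmetic explains the later snapshots: 07:32:11 + 5 min = 07:37:11 (snapshot 150) and 07:38:11 + 5 min = 07:43:11 (snapshot 151).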


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:31:31:324 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:31:51:325 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:32:11:325 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18514

Wed Sep 18 07:32:18:790 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:32:31:326 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 18514 terminated

Wed Sep 18 07:32:51:327 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:33:11:327 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:33:31:328 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:33:51:328 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:34:11:329 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:34:31:329 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:34:51:329 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:35:11:329 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19737
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19738

Wed Sep 18 07:35:12:802 2019

*** ERROR => DpHdlDeadWp: W1 (pid 19737) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19737) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19737)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19738) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19738) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19738)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:35:31:330 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:35:51:330 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:36:11:330 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:36:31:331 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:36:51:331 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:32:11 2019, skip new snapshot

Wed Sep 18 07:37:11:331 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 150 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:37:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:37:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:37:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10986229, readCount 10986229)
UPD : 0 (peak 31, writeCount 2404, readCount 2404)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080711, readCount 1080711)
SPO : 0 (peak 2, writeCount 12803, readCount 12803)
UP2 : 0 (peak 1, writeCount 1223, readCount 1223)
DISP: 0 (peak 67, writeCount 422951, readCount 422951)
GW : 0 (peak 45, writeCount 10011215, readCount 10011215)
ICM : 0 (peak 186, writeCount 199582, readCount 199582)
LWP : 0 (peak 15, writeCount 19303, readCount 19303)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:37:11 2019
------------------------------------------------------------
Current snapshot id: 150
DB clean time (in percent of total time) : 24.54 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |181|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |181|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 07:37:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:37:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:37:11 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:37:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2907| 47| | |
| 1|DDLOG | 2907| 47| | |
| 2|BTCSCHED | 5816| 49| | |
| 3|RESTART_ALL | 1163| 165| | |
| 4|ENVCHECK | 17451| 20| | |
| 5|AUTOABAP | 1163| 165| | |
| 6|BGRFC_WATCHDOG | 1164| 165| | |
| 7|AUTOTH | 1897| 55| | |
| 8|AUTOCCMS | 5816| 49| | |
| 9|AUTOSECURITY | 5816| 49| | |
| 10|LOAD_CALCULATION | 348511| 0| | |
| 11|SPOOLALRM | 5817| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 150 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:37:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:37:31:331 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:37:51:332 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:38:11:333 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 20967

Wed Sep 18 07:38:18:885 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:38:31:333 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 20967 terminated

Wed Sep 18 07:38:51:334 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:39:11:335 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:39:31:335 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:39:51:336 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:40:11:337 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21840
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-21841

Wed Sep 18 07:40:13:044 2019

*** ERROR => DpHdlDeadWp: W1 (pid 21840) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21840) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21840)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 21841) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21841) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 21841)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:40:31:337 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:40:51:338 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:41:11:338 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:41:31:339 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:41:51:340 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:42:11:340 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:42:31:341 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:42:51:341 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:38:11 2019, skip new snapshot

Wed Sep 18 07:43:11:342 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 151 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:43:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:43:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:43:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 10987276, readCount 10987276)
UPD : 0 (peak 31, writeCount 2406, readCount 2406)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080719, readCount 1080719)
SPO : 0 (peak 2, writeCount 12817, readCount 12817)
UP2 : 0 (peak 1, writeCount 1225, readCount 1225)
DISP: 0 (peak 67, writeCount 422992, readCount 422992)
GW : 0 (peak 45, writeCount 10011973, readCount 10011973)
ICM : 0 (peak 186, writeCount 199587, readCount 199587)
LWP : 2 (peak 15, writeCount 19333, readCount 19331)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:43:11 2019
------------------------------------------------------------
Current snapshot id: 151
DB clean time (in percent of total time) : 24.54 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |182|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |182|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 07:43:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000| |SST-LAP-LEN0028 |06:54:11|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions
Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:43:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:43:11 2019
------------------------------------------------------------
Current pipes in use: 231
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:43:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2910| 47| | |
| 1|DDLOG | 2910| 47| | |
| 2|BTCSCHED | 5822| 49| | |
| 3|RESTART_ALL | 1164| 105| | |
| 4|ENVCHECK | 17469| 20| | |
| 5|AUTOABAP | 1164| 105| | |
| 6|BGRFC_WATCHDOG | 1165| 105| | |
| 7|AUTOTH | 1903| 55| | |
| 8|AUTOCCMS | 5822| 49| | |
| 9|AUTOSECURITY | 5822| 49| | |
| 10|LOAD_CALCULATION | 348870| 1| | |
| 11|SPOOLALRM | 5823| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 151 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:43:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Wed Sep 18 07:43:31:343 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:43:51:343 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:44:11:344 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22882

Wed Sep 18 07:44:19:438 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:44:31:344 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22882 terminated

Wed Sep 18 07:44:51:345 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:45:11:346 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-23683
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-23684

Wed Sep 18 07:45:13:065 2019


*** ERROR => DpHdlDeadWp: W1 (pid 23683) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23683) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 23683)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 23684) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23684) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 23684)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:45:31:346 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:45:51:346 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:46:11:347 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:46:31:347 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:46:51:348 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:47:11:349 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:47:31:350 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:47:51:351 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:48:11:351 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:48:31:352 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:48:51:359 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:44:11 2019, skip new snapshot

Wed Sep 18 07:49:11:360 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 152 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:49:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:49:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:49:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10988116, readCount 10988116)
UPD : 0 (peak 31, writeCount 2407, readCount 2407)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080723, readCount 1080723)
SPO : 0 (peak 2, writeCount 12830, readCount 12830)
UP2 : 0 (peak 1, writeCount 1226, readCount 1226)
DISP: 0 (peak 67, writeCount 423035, readCount 423035)
GW : 0 (peak 45, writeCount 10012531, readCount 10012531)
ICM : 0 (peak 186, writeCount 199590, readCount 199590)
LWP : 2 (peak 15, writeCount 19348, readCount 19346)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests
Workprocess Table (long) Wed Sep 18 07:49:11 2019
------------------------------------------------------------

Current snapshot id: 152


DB clean time (in percent of total time) : 24.54 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |183|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |183|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:49:11 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI         |T39_U18787_M0   |000|            |SST-LAP-LEN0028     |07:48:20|4  |SAPMSYST                                |high|     |                                                  |SESSION_MA|      4233|
|GUI         |T54_U14445_M0   |000|            |SST-LAP-LEN0028     |06:54:11|5  |SAPMSYST                                |high|     |                                                  |SESSION_MA|      4233|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:49:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:49:11 2019


------------------------------------------------------------
Current pipes in use: 205
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:49:11 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2913|        47|                    |          |
|       1|DDLOG               |      2913|        47|                    |          |
|       2|BTCSCHED            |      5828|        49|                    |          |
|       3|RESTART_ALL         |      1165|        45|                    |          |
|       4|ENVCHECK            |     17487|        20|                    |          |
|       5|AUTOABAP            |      1165|        45|                    |          |
|       6|BGRFC_WATCHDOG      |      1166|        45|                    |          |
|       7|AUTOTH              |      1909|        55|                    |          |
|       8|AUTOCCMS            |      5828|        49|                    |          |
|       9|AUTOSECURITY        |      5828|        49|                    |          |
|      10|LOAD_CALCULATION    |    349229|         1|                    |          |
|      11|SPOOLALRM           |      5829|        49|                    |          |

Found 12 periodic tasks

********** SERVER SNAPSHOT 152 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:49:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:49:31:361 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:49:51:361 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:50:11:362 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-25384
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-25385
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25386

Wed Sep 18 07:50:13:044 2019


*** ERROR => DpHdlDeadWp: W1 (pid 25384) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25384) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25384)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25385) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25385) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25385)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:50:18:573 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:50:31:362 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25386 terminated

Wed Sep 18 07:50:51:363 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:51:11:364 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:51:31:364 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:51:51:365 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:52:11:365 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:52:31:367 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:52:51:367 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:53:11:367 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:53:31:368 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:53:51:368 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
Wed Sep 18 07:54:11:369 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:54:31:369 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:54:51:370 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:50:11 2019, skip new snapshot

Wed Sep 18 07:55:11:371 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 153 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:55:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 07:55:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 07:55:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10988953, readCount 10988953)
UPD : 0 (peak 31, writeCount 2408, readCount 2408)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080727, readCount 1080727)
SPO : 0 (peak 2, writeCount 12843, readCount 12843)
UP2 : 0 (peak 1, writeCount 1227, readCount 1227)
DISP: 0 (peak 67, writeCount 423083, readCount 423083)
GW : 0 (peak 45, writeCount 10013084, readCount 10013084)
ICM : 0 (peak 186, writeCount 199593, readCount 199593)
LWP : 2 (peak 15, writeCount 19363, readCount 19361)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 07:55:11 2019


------------------------------------------------------------

Current snapshot id: 153


DB clean time (in percent of total time) : 24.55 %
Number of preemptions : 78

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  1|        |DIA |WP_KILL|     |184|norm|T138_U21626_M0  |HTTP_NORM|      |   |     |CL_HTTP_SERVER_NET============CP        |001|SM_EXTERN_WS|                    |                    |
| 12|        |BTC |WP_KILL|     |184|low |T105_U21576_M0  |BATCH    |      |   |     |SAPLSCSM_BI_DATALOAD                    |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 07:55:11 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI         |T54_U14445_M0   |000|SAP*        |SST-LAP-LEN0028     |07:55:04|3  |SAPMSYST                                |high|     |                                                  |SESSION_MA|      4233|
|BATCH       |T105_U21576_M0  |001|SM_EFWK     |                    |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE             |low |     |                                                  |          |     12439|
|HTTP_NORMAL |T138_U21626_M0  |001|SM_EXTERN_WS|10.54.36.37         |16:43:09|1  |SAPMHTTP                                |norm|     |                                                  |          |      4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 07:55:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 07:55:11 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 07:55:11 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      2916|        47|                    |          |
|       1|DDLOG               |      2916|        47|                    |          |
|       2|BTCSCHED            |      5834|        49|                    |          |
|       3|RESTART_ALL         |      1167|       285|                    |          |
|       4|ENVCHECK            |     17505|        20|                    |          |
|       5|AUTOABAP            |      1167|       285|                    |          |
|       6|BGRFC_WATCHDOG      |      1168|       285|                    |          |
|       7|AUTOTH              |      1915|        55|                    |          |
|       8|AUTOCCMS            |      5834|        49|                    |          |
|       9|AUTOSECURITY        |      5834|        49|                    |          |
|      10|LOAD_CALCULATION    |    349587|         1|                    |          |
|      11|SPOOLALRM           |      5835|        49|                    |          |

Found 12 periodic tasks

********** SERVER SNAPSHOT 153 (Reason: Workprocess 1 died / Time: Wed Sep 18 07:55:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-27185
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:55:11:376 2019


DpWpDynCreate: created new work process W12-27186

Wed Sep 18 07:55:13:080 2019


*** ERROR => DpHdlDeadWp: W1 (pid 27185) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27185) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27185)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 27186) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27186) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 27186)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:55:31:371 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:55:51:372 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 07:56:11:372 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27449

Wed Sep 18 07:56:18:324 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:56:31:372 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27449 terminated

Wed Sep 18 07:56:51:373 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:57:11:374 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:57:31:374 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:57:51:375 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:58:11:375 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:58:31:376 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:58:51:376 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:59:11:377 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:59:31:377 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 07:59:51:378 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 08:00:11:379 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-29092
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-29093

Wed Sep 18 08:00:13:103 2019


*** ERROR => DpHdlDeadWp: W1 (pid 29092) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29092) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 29092)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 29093) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29093) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 29093)
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 08:00:31:380 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 08:00:51:380 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 07:56:11 2019, skip new snapshot

Wed Sep 18 08:01:11:381 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 154 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:01:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:01:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:01:11 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10989777, readCount 10989777)
UPD : 0 (peak 31, writeCount 2409, readCount 2409)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080731, readCount 1080731)
SPO : 0 (peak 2, writeCount 12856, readCount 12856)
UP2 : 0 (peak 1, writeCount 1228, readCount 1228)
DISP: 0 (peak 67, writeCount 423124, readCount 423124)
GW : 0 (peak 45, writeCount 10013606, readCount 10013606)
ICM : 0 (peak 186, writeCount 199596, readCount 199596)
LWP : 0 (peak 15, writeCount 19378, readCount 19378)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:01:11 2019
------------------------------------------------------------

Current snapshot id: 154

DB clean time (in percent of total time) : 24.55 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |186|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |186|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 08:01:11 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:01:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:01:11 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:01:11 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2919| 47| | |
| 1|DDLOG | 2919| 47| | |
| 2|BTCSCHED | 5840| 49| | |
| 3|RESTART_ALL | 1168| 225| | |
| 4|ENVCHECK | 17523| 20| | |
| 5|AUTOABAP | 1168| 225| | |
| 6|BGRFC_WATCHDOG | 1169| 225| | |
| 7|AUTOTH | 1921| 55| | |
| 8|AUTOCCMS | 5840| 49| | |
| 9|AUTOSECURITY | 5840| 49| | |
| 10|LOAD_CALCULATION | 349946| 1| | |
| 11|SPOOLALRM | 5841| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 154 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:01:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:01:31:381 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:01:51:382 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:02:11:382 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7171

Wed Sep 18 08:02:18:258 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:02:31:383 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7171 terminated

Wed Sep 18 08:02:51:383 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:03:11:384 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:03:31:385 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:03:51:385 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:04:11:386 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:04:31:387 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:04:51:387 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:05:11:388 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-19625
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-19626

Wed Sep 18 08:05:13:158 2019

*** ERROR => DpHdlDeadWp: W1 (pid 19625) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19625) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19625)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19626) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19626) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19626)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:05:31:388 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:05:51:389 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:06:11:389 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:06:31:390 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:06:51:390 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:02:11 2019, skip new snapshot

Wed Sep 18 08:07:11:391 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 155 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:07:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:07:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:07:11 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10990859, readCount 10990859)
UPD : 0 (peak 31, writeCount 2410, readCount 2410)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080735, readCount 1080735)
SPO : 0 (peak 2, writeCount 12869, readCount 12869)
UP2 : 0 (peak 1, writeCount 1229, readCount 1229)
DISP: 0 (peak 67, writeCount 423165, readCount 423165)
GW : 0 (peak 45, writeCount 10014372, readCount 10014372)
ICM : 0 (peak 186, writeCount 199599, readCount 199599)
LWP : 0 (peak 15, writeCount 19393, readCount 19393)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:07:11 2019
------------------------------------------------------------

Current snapshot id: 155

DB clean time (in percent of total time) : 24.55 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |187|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |187|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 08:07:11 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:07:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:07:11 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:07:11 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2922| 47| | |
| 1|DDLOG | 2922| 47| | |
| 2|BTCSCHED | 5846| 49| | |
| 3|RESTART_ALL | 1169| 165| | |
| 4|ENVCHECK | 17541| 20| | |
| 5|AUTOABAP | 1169| 165| | |
| 6|BGRFC_WATCHDOG | 1170| 165| | |
| 7|AUTOTH | 1927| 55| | |
| 8|AUTOCCMS | 5846| 49| | |
| 9|AUTOSECURITY | 5846| 49| | |
| 10|LOAD_CALCULATION | 350305| 1| | |
| 11|SPOOLALRM | 5847| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 155 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:07:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:07:31:392 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:07:51:393 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:08:11:393 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 763

Wed Sep 18 08:08:18:288 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:08:31:394 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 763 terminated

Wed Sep 18 08:08:51:395 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:09:11:396 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:09:31:397 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:09:51:397 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:10:11:398 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-1721
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-1723

Wed Sep 18 08:10:13:090 2019

*** ERROR => DpHdlDeadWp: W1 (pid 1721) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1721) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1721)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1723) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1723) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1723)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:10:31:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:10:51:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:11:11:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:11:31:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:11:51:399 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:12:11:400 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:12:31:400 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:12:51:400 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:08:11 2019, skip new snapshot

Wed Sep 18 08:13:11:401 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 156 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:13:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:13:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:13:11 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10991751, readCount 10991751)
UPD : 0 (peak 31, writeCount 2412, readCount 2412)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080743, readCount 1080743)
SPO : 0 (peak 2, writeCount 12883, readCount 12883)
UP2 : 0 (peak 1, writeCount 1231, readCount 1231)
DISP: 0 (peak 67, writeCount 423206, readCount 423206)
GW : 0 (peak 45, writeCount 10014984, readCount 10014984)
ICM : 0 (peak 186, writeCount 199606, readCount 199606)
LWP : 2 (peak 15, writeCount 19423, readCount 19421)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:13:11 2019
------------------------------------------------------------

Current snapshot id: 156

DB clean time (in percent of total time) : 24.55 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |188|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |188|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 08:13:11 2019
------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions

Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:13:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:13:11 2019
------------------------------------------------------------
Current pipes in use: 219
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:13:11 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2925| 47| | |
| 1|DDLOG | 2925| 47| | |
| 2|BTCSCHED | 5852| 49| | |
| 3|RESTART_ALL | 1170| 105| | |
| 4|ENVCHECK | 17559| 20| | |
| 5|AUTOABAP | 1170| 105| | |
| 6|BGRFC_WATCHDOG | 1171| 105| | |
| 7|AUTOTH | 1933| 55| | |
| 8|AUTOCCMS | 5852| 49| | |
| 9|AUTOSECURITY | 5852| 49| | |
| 10|LOAD_CALCULATION | 350663| 1| | |
| 11|SPOOLALRM | 5853| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 156 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:13:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:13:31:401 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:13:51:402 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:14:11:403 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 3236

Wed Sep 18 08:14:18:651 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:14:31:403 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 3236 terminated

Wed Sep 18 08:14:51:404 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:15:11:405 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-3806
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-3807

Wed Sep 18 08:15:13:169 2019


*** ERROR => DpHdlDeadWp: W1 (pid 3806) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3806) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 3806)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 3807) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W12 (pid = 3807)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:15:31:405 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:15:51:405 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:16:11:406 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:16:31:406 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:16:51:407 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:17:11:408 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:17:31:408 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:17:51:409 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:18:11:409 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:18:31:410 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:18:51:411 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:14:11 2019, skip new snapshot

Wed Sep 18 08:19:11:412 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 157 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:19:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:19:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:19:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10992604, readCount 10992604)


UPD : 0 (peak 31, writeCount 2413, readCount 2413)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080747, readCount 1080747)
SPO : 0 (peak 2, writeCount 12896, readCount 12896)
UP2 : 0 (peak 1, writeCount 1232, readCount 1232)
DISP: 0 (peak 67, writeCount 423247, readCount 423247)
GW : 0 (peak 45, writeCount 10015554, readCount 10015554)
ICM : 0 (peak 186, writeCount 199609, readCount 199609)
LWP : 2 (peak 15, writeCount 19438, readCount 19436)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:19:11 2019


------------------------------------------------------------

Current snapshot id: 157


DB clean time (in percent of total time) : 24.56 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |189|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |189|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:19:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:19:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:19:11 2019


------------------------------------------------------------
Current pipes in use: 207
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:19:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2928| 47| | |
| 1|DDLOG | 2928| 47| | |
| 2|BTCSCHED | 5858| 49| | |
| 3|RESTART_ALL | 1171| 45| | |
| 4|ENVCHECK | 17577| 20| | |
| 5|AUTOABAP | 1171| 45| | |
| 6|BGRFC_WATCHDOG | 1172| 45| | |
| 7|AUTOTH | 1939| 55| | |
| 8|AUTOCCMS | 5858| 49| | |
| 9|AUTOSECURITY | 5858| 49| | |
| 10|LOAD_CALCULATION | 351022| 1| | |
| 11|SPOOLALRM | 5859| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 157 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:19:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:19:31:412 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:19:51:412 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:20:11:413 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-5241
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-5242
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 5243

Wed Sep 18 08:20:13:122 2019


*** ERROR => DpHdlDeadWp: W1 (pid 5241) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5241) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5241)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5242) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5242) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5242)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:20:18:205 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:20:31:414 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 5243 terminated

Wed Sep 18 08:20:51:414 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:21:11:415 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:21:31:415 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:21:51:415 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:22:11:416 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:22:31:416 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:22:51:417 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:23:11:418 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:23:31:418 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:23:51:419 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:24:11:419 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:24:31:420 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:24:51:420 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:20:11 2019, skip new snapshot

Wed Sep 18 08:25:11:421 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 158 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:25:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:25:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:25:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10993461, readCount 10993461)


UPD : 0 (peak 31, writeCount 2414, readCount 2414)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080751, readCount 1080751)
SPO : 0 (peak 2, writeCount 12909, readCount 12909)
UP2 : 0 (peak 1, writeCount 1233, readCount 1233)
DISP: 0 (peak 67, writeCount 423295, readCount 423295)
GW : 0 (peak 45, writeCount 10016111, readCount 10016111)
ICM : 0 (peak 186, writeCount 199612, readCount 199612)
LWP : 2 (peak 15, writeCount 19453, readCount 19451)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:25:11 2019


------------------------------------------------------------

Current snapshot id: 158


DB clean time (in percent of total time) : 24.56 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |190|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |190|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:25:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:25:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:25:11 2019


------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:25:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2931| 47| | |
| 1|DDLOG | 2931| 47| | |
| 2|BTCSCHED | 5864| 49| | |
| 3|RESTART_ALL | 1173| 285| | |
| 4|ENVCHECK | 17595| 20| | |
| 5|AUTOABAP | 1173| 285| | |
| 6|BGRFC_WATCHDOG | 1174| 285| | |
| 7|AUTOTH | 1945| 55| | |
| 8|AUTOCCMS | 5864| 49| | |
| 9|AUTOSECURITY | 5864| 49| | |
| 10|LOAD_CALCULATION | 351380| 0| | |
| 11|SPOOLALRM | 5865| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 158 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:25:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-7330
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:25:11:427 2019


DpWpDynCreate: created new work process W12-7331

Wed Sep 18 08:25:13:142 2019


*** ERROR => DpHdlDeadWp: W1 (pid 7330) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7330) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 7330)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 7331) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=7331) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 7331)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:25:31:422 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:25:51:423 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:26:11:423 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 7522

Wed Sep 18 08:26:18:491 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:26:31:424 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 7522 terminated

Wed Sep 18 08:26:51:424 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:27:11:425 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:27:31:425 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:27:51:426 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:28:11:426 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:28:31:427 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:28:51:427 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:29:11:427 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:29:31:428 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:29:51:428 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:30:11:428 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-9015
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-9016

Wed Sep 18 08:30:13:128 2019


*** ERROR => DpHdlDeadWp: W1 (pid 9015) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9015) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 9015)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 9016) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=9016) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 9016)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new snapshot

Wed Sep 18 08:30:31:429 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new
snapshot
Wed Sep 18 08:30:51:430 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:26:11 2019, skip new
snapshot

Wed Sep 18 08:31:11:430 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 159 (Reason: Workprocess 1 died / Time: Wed Sep 18
08:31:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:31:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:31:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000


DIA : 0 (peak 291, writeCount 10994295, readCount 10994295)
UPD : 0 (peak 31, writeCount 2415, readCount 2415)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080755, readCount 1080755)
SPO : 0 (peak 2, writeCount 12922, readCount 12922)
UP2 : 0 (peak 1, writeCount 1234, readCount 1234)
DISP: 0 (peak 67, writeCount 423336, readCount 423336)
GW : 0 (peak 45, writeCount 10016651, readCount 10016651)
ICM : 0 (peak 186, writeCount 199615, readCount 199615)
LWP : 0 (peak 15, writeCount 19468, readCount 19468)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:31:11 2019


------------------------------------------------------------

Current snapshot id: 159


DB clean time (in percent of total time) : 24.56 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |192|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |192|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:31:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:31:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:31:11 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:31:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2934| 47| | |
| 1|DDLOG | 2934| 47| | |
| 2|BTCSCHED | 5870| 49| | |
| 3|RESTART_ALL | 1174| 225| | |
| 4|ENVCHECK | 17613| 20| | |
| 5|AUTOABAP | 1174| 225| | |
| 6|BGRFC_WATCHDOG | 1175| 225| | |
| 7|AUTOTH | 1951| 55| | |
| 8|AUTOCCMS | 5870| 49| | |
| 9|AUTOSECURITY | 5870| 49| | |
| 10|LOAD_CALCULATION | 351739| 1| | |
| 11|SPOOLALRM | 5871| 49| | |
Found 12 periodic tasks

********** SERVER SNAPSHOT 159 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:31:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:31:31:431 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:31:51:431 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:32:11:432 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9729

Wed Sep 18 08:32:17:994 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:32:31:432 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9729 terminated

Wed Sep 18 08:32:51:433 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:33:11:433 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:33:31:434 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:33:51:435 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:34:11:435 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:34:31:436 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:34:51:436 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:35:11:437 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10914
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10915

Wed Sep 18 08:35:13:154 2019

*** ERROR => DpHdlDeadWp: W1 (pid 10914) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10914) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10914)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10915) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10915) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10915)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:35:31:437 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:35:51:438 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:36:11:439 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:36:31:439 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:36:51:440 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:32:11 2019, skip new snapshot

Wed Sep 18 08:37:11:441 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 160 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:37:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:37:11 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:37:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10995146, readCount 10995146)


UPD : 0 (peak 31, writeCount 2416, readCount 2416)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080759, readCount 1080759)
SPO : 0 (peak 2, writeCount 12935, readCount 12935)
UP2 : 0 (peak 1, writeCount 1235, readCount 1235)
DISP: 0 (peak 67, writeCount 423379, readCount 423379)
GW : 1 (peak 45, writeCount 10017183, readCount 10017182)
ICM : 0 (peak 186, writeCount 199618, readCount 199618)
LWP : 0 (peak 15, writeCount 19483, readCount 19483)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:37:11 2019


------------------------------------------------------------

Current snapshot id: 160


DB clean time (in percent of total time) : 24.57 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |193|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |193|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:37:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:37:11 2019


CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:37:11 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:37:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2937| 47| | |
| 1|DDLOG | 2937| 47| | |
| 2|BTCSCHED | 5876| 49| | |
| 3|RESTART_ALL | 1175| 165| | |
| 4|ENVCHECK | 17631| 20| | |
| 5|AUTOABAP | 1175| 165| | |
| 6|BGRFC_WATCHDOG | 1176| 165| | |
| 7|AUTOTH | 1957| 55| | |
| 8|AUTOCCMS | 5876| 49| | |
| 9|AUTOSECURITY | 5876| 49| | |
| 10|LOAD_CALCULATION | 352098| 1| | |
| 11|SPOOLALRM | 5877| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 160 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:37:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:37:31:442 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:37:51:443 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:38:11:443 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 11640

Wed Sep 18 08:38:18:563 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:38:31:444 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 11640 terminated

Wed Sep 18 08:38:51:444 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:39:11:445 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:39:31:445 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:39:51:446 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:40:11:446 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-12550
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-12551

Wed Sep 18 08:40:13:173 2019

*** ERROR => DpHdlDeadWp: W1 (pid 12550) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12550) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 12550)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 12551) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=12551) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 12551)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:40:31:446 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:40:51:447 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:41:11:447 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:41:31:447 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:41:51:448 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:42:11:449 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:42:31:449 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:42:51:450 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:38:11 2019, skip new snapshot

Wed Sep 18 08:43:11:451 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 161 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:43:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:43:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:43:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10995980, readCount 10995980)


UPD : 0 (peak 31, writeCount 2418, readCount 2418)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080767, readCount 1080767)
SPO : 0 (peak 2, writeCount 12949, readCount 12949)
UP2 : 0 (peak 1, writeCount 1237, readCount 1237)
DISP: 0 (peak 67, writeCount 423423, readCount 423423)
GW : 0 (peak 45, writeCount 10017717, readCount 10017717)
ICM : 1 (peak 186, writeCount 199623, readCount 199622)
LWP : 2 (peak 15, writeCount 19513, readCount 19511)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:43:11 2019


------------------------------------------------------------

Current snapshot id: 161


DB clean time (in percent of total time) : 24.57 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |194|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |194|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:43:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:43:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:43:11 2019


------------------------------------------------------------
Current pipes in use: 233
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:43:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2940| 47| | |
| 1|DDLOG | 2940| 47| | |
| 2|BTCSCHED | 5882| 49| | |
| 3|RESTART_ALL | 1176| 105| | |
| 4|ENVCHECK | 17649| 20| | |
| 5|AUTOABAP | 1176| 105| | |
| 6|BGRFC_WATCHDOG | 1177| 105| | |
| 7|AUTOTH | 1963| 55| | |
| 8|AUTOCCMS | 5882| 49| | |
| 9|AUTOSECURITY | 5882| 49| | |
| 10|LOAD_CALCULATION | 352457| 1| | |
| 11|SPOOLALRM | 5883| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 161 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:43:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:43:31:451 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:43:51:452 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:44:11:452 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13824

Wed Sep 18 08:44:18:365 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:44:31:453 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13824 terminated

Wed Sep 18 08:44:51:454 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:45:11:455 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-14473
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-14474

Wed Sep 18 08:45:13:114 2019

*** ERROR => DpHdlDeadWp: W1 (pid 14473) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14473) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14473)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 14474) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14474) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 14474)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:45:31:455 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:45:51:456 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:46:11:456 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:46:31:457 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:46:51:457 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:47:11:457 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:47:31:458 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:47:51:459 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:48:11:459 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:48:31:459 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:48:51:460 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:44:11 2019, skip new snapshot

Wed Sep 18 08:49:11:461 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 162 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:49:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:49:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:49:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10996798, readCount 10996798)


UPD : 0 (peak 31, writeCount 2419, readCount 2419)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080771, readCount 1080771)
SPO : 0 (peak 2, writeCount 12962, readCount 12962)
UP2 : 0 (peak 1, writeCount 1238, readCount 1238)
DISP: 0 (peak 67, writeCount 423467, readCount 423467)
GW : 0 (peak 45, writeCount 10018263, readCount 10018263)
ICM : 0 (peak 186, writeCount 199626, readCount 199626)
LWP : 2 (peak 15, writeCount 19528, readCount 19526)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:49:11 2019


------------------------------------------------------------

Current snapshot id: 162


DB clean time (in percent of total time) : 24.57 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |195|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 6|32125 |DIA |WP_RUN | | |norm|T66_U23293_M0 |HTTP_NORM| | |1|<HANDLE PLUGIN> |000| | | |
| 12| |BTC |WP_KILL| |195|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:49:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|HTTP_NORMAL |T66_U23293_M0 |000| |10.54.36.33 |08:49:10|6 |SAPMHTTP |norm| | | | 4590|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 5 logons with 5 sessions


Total ES (gross) memory of all sessions: 29 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:49:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:49:11 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:49:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2943| 47| | |
| 1|DDLOG | 2943| 47| | |
| 2|BTCSCHED | 5888| 49| | |
| 3|RESTART_ALL | 1177| 45| | |
| 4|ENVCHECK | 17667| 20| | |
| 5|AUTOABAP | 1177| 45| | |
| 6|BGRFC_WATCHDOG | 1178| 45| | |
| 7|AUTOTH | 1969| 55| | |
| 8|AUTOCCMS | 5888| 49| | |
| 9|AUTOSECURITY | 5888| 49| | |
| 10|LOAD_CALCULATION | 352815| 1| | |
| 11|SPOOLALRM | 5889| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 162 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:49:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:49:31:461 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:49:51:462 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:50:11:462 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-15943
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-15944
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 15945

Wed Sep 18 08:50:13:127 2019


*** ERROR => DpHdlDeadWp: W1 (pid 15943) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15943) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 15943)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 15944) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=15944) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 15944)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:50:18:517 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:50:31:463 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 15945 terminated

Wed Sep 18 08:50:51:463 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:51:11:464 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:51:31:465 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:51:51:464 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:52:11:464 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:52:31:465 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:52:51:466 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:53:11:466 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:53:31:467 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:53:51:467 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:54:11:468 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:54:31:469 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:54:51:469 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:50:11 2019, skip new snapshot

Wed Sep 18 08:55:11:470 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 163 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:55:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 08:55:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 08:55:11 2019


------------------------------------------------------------
Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10997692, readCount 10997692)


UPD : 0 (peak 31, writeCount 2420, readCount 2420)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080775, readCount 1080775)
SPO : 0 (peak 2, writeCount 12975, readCount 12975)
UP2 : 0 (peak 1, writeCount 1239, readCount 1239)
DISP: 0 (peak 67, writeCount 423512, readCount 423512)
GW : 0 (peak 45, writeCount 10018828, readCount 10018828)
ICM : 0 (peak 186, writeCount 199629, readCount 199629)
LWP : 2 (peak 15, writeCount 19543, readCount 19541)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 08:55:11 2019


------------------------------------------------------------

Current snapshot id: 163


DB clean time (in percent of total time) : 24.57 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |196|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |196|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 08:55:11 2019


------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U14445_M0 |000|SAP* |SST-LAP-LEN0028 |07:55:04|3 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 08:55:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 08:55:11 2019


------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 08:55:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2946| 47| | |
| 1|DDLOG | 2946| 47| | |
| 2|BTCSCHED | 5894| 49| | |
| 3|RESTART_ALL | 1179| 285| | |
| 4|ENVCHECK | 17685| 20| | |
| 5|AUTOABAP | 1179| 285| | |
| 6|BGRFC_WATCHDOG | 1180| 285| | |
| 7|AUTOTH | 1975| 55| | |
| 8|AUTOCCMS | 5894| 49| | |
| 9|AUTOSECURITY | 5894| 49| | |
| 10|LOAD_CALCULATION | 353173| 0| | |
| 11|SPOOLALRM | 5895| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 163 (Reason: Workprocess 1 died / Time: Wed Sep 18 08:55:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-19263
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:55:11:476 2019


DpWpDynCreate: created new work process W12-19264

Wed Sep 18 08:55:13:262 2019


*** ERROR => DpHdlDeadWp: W1 (pid 19263) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19263) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19263)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 19264) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19264) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 19264)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:55:31:471 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:55:51:472 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 08:56:11:473 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19817

Wed Sep 18 08:56:19:761 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 08:56:31:474 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19817 terminated

Wed Sep 18 08:56:51:475 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 08:57:11:476 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 08:57:31:476 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 08:57:51:477 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 08:58:11:477 2019

*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 08:58:31:478 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot

Wed Sep 18 08:58:51:478 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot

Wed Sep 18 08:59:11:479 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot

Wed Sep 18 08:59:31:480 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot

Wed Sep 18 08:59:51:480 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot

Wed Sep 18 09:00:11:481 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W1-22484
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new
snapshot
DpWpDynCreate: created new work process W12-22485

Wed Sep 18 09:00:13:909 2019
*** ERROR => DpHdlDeadWp: W1 (pid 22484) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22484) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 22484)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 22485) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22485) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 22485)
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 09:00:31:482 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 09:00:51:482 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 08:56:11 2019, skip new snapshot

Wed Sep 18 09:01:11:483 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 164 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:01:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:01:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0
Queue Statistics Wed Sep 18 09:01:11 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10998491, readCount 10998491)
UPD : 0 (peak 31, writeCount 2421, readCount 2421)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080779, readCount 1080779)
SPO : 0 (peak 2, writeCount 12988, readCount 12988)
UP2 : 0 (peak 1, writeCount 1240, readCount 1240)
DISP: 0 (peak 67, writeCount 423556, readCount 423556)
GW : 0 (peak 45, writeCount 10019356, readCount 10019356)
ICM : 0 (peak 186, writeCount 199632, readCount 199632)
LWP : 0 (peak 15, writeCount 19558, readCount 19558)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:01:11 2019
------------------------------------------------------------
Current snapshot id: 164
DB clean time (in percent of total time) : 24.58 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |198|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |198|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 09:01:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:01:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:01:11 2019
------------------------------------------------------------
Current pipes in use: 219
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:01:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2949| 47| | |
| 1|DDLOG | 2949| 47| | |
| 2|BTCSCHED | 5900| 49| | |
| 3|RESTART_ALL | 1180| 225| | |
| 4|ENVCHECK | 17703| 20| | |
| 5|AUTOABAP | 1180| 225| | |
| 6|BGRFC_WATCHDOG | 1181| 225| | |
| 7|AUTOTH | 1981| 55| | |
| 8|AUTOCCMS | 5900| 49| | |
| 9|AUTOSECURITY | 5900| 49| | |
| 10|LOAD_CALCULATION | 353531| 0| | |
| 11|SPOOLALRM | 5901| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 164 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:01:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:01:31:483 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:01:51:483 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:02:11:484 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23470

Wed Sep 18 09:02:19:234 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:02:31:484 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 23470 terminated

Wed Sep 18 09:02:51:485 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:03:11:486 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:03:31:487 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:03:51:487 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:04:11:487 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:04:31:488 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:04:51:488 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:05:11:489 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-25498
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-25499

Wed Sep 18 09:05:13:982 2019
*** ERROR => DpHdlDeadWp: W1 (pid 25498) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25498) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25498)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 25499) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25499) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 25499)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:05:31:489 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:05:51:490 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:06:11:490 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:06:31:491 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:06:51:491 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:02:11 2019, skip new snapshot

Wed Sep 18 09:07:11:492 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 165 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:07:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:07:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:07:11 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 10999769, readCount 10999769)
UPD : 0 (peak 31, writeCount 2422, readCount 2422)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080783, readCount 1080783)
SPO : 0 (peak 2, writeCount 13001, readCount 13001)
UP2 : 0 (peak 1, writeCount 1241, readCount 1241)
DISP: 0 (peak 67, writeCount 423600, readCount 423600)
GW : 0 (peak 45, writeCount 10020320, readCount 10020320)
ICM : 0 (peak 186, writeCount 199635, readCount 199635)
LWP : 0 (peak 15, writeCount 19573, readCount 19573)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:07:11 2019
------------------------------------------------------------
Current snapshot id: 165
DB clean time (in percent of total time) : 24.58 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |199|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |199|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 09:07:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:07:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:07:11 2019
------------------------------------------------------------
Current pipes in use: 215
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:07:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2952| 47| | |
| 1|DDLOG | 2952| 47| | |
| 2|BTCSCHED | 5906| 49| | |
| 3|RESTART_ALL | 1181| 165| | |
| 4|ENVCHECK | 17721| 20| | |
| 5|AUTOABAP | 1181| 165| | |
| 6|BGRFC_WATCHDOG | 1182| 165| | |
| 7|AUTOTH | 1987| 55| | |
| 8|AUTOCCMS | 5906| 49| | |
| 9|AUTOSECURITY | 5906| 49| | |
| 10|LOAD_CALCULATION | 353890| 1| | |
| 11|SPOOLALRM | 5907| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 165 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:07:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:07:31:492 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:07:51:494 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:08:11:493 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27046

Wed Sep 18 09:08:19:756 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:08:31:494 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27046 terminated

Wed Sep 18 09:08:51:494 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:09:11:494 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:09:31:495 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:09:51:496 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:10:11:497 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-28485
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-28486

Wed Sep 18 09:10:16:055 2019
*** ERROR => DpHdlDeadWp: W1 (pid 28485) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28485) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28485)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 28486) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28486) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 28486)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:10:31:497 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:10:51:498 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:11:11:499 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:11:31:499 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:11:51:499 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:12:11:500 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:12:31:501 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:12:51:502 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:08:11 2019, skip new snapshot

Wed Sep 18 09:13:11:502 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 166 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:13:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:13:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:13:11 2019
------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 11000720, readCount 11000720)
UPD : 0 (peak 31, writeCount 2424, readCount 2424)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080791, readCount 1080791)
SPO : 0 (peak 2, writeCount 13015, readCount 13015)
UP2 : 0 (peak 1, writeCount 1243, readCount 1243)
DISP: 0 (peak 67, writeCount 423641, readCount 423641)
GW : 0 (peak 45, writeCount 10020974, readCount 10020974)
ICM : 0 (peak 186, writeCount 199642, readCount 199642)
LWP : 2 (peak 15, writeCount 19603, readCount 19601)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:13:11 2019
------------------------------------------------------------
Current snapshot id: 166
DB clean time (in percent of total time) : 24.58 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |200|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |200|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 09:13:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:13:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:13:11 2019
------------------------------------------------------------
Current pipes in use: 227
Current / maximal blocks in use: 0 / 1884
Periodic Tasks Wed Sep 18 09:13:11 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2955| 47| | |
| 1|DDLOG | 2955| 47| | |
| 2|BTCSCHED | 5912| 49| | |
| 3|RESTART_ALL | 1182| 105| | |
| 4|ENVCHECK | 17739| 20| | |
| 5|AUTOABAP | 1182| 105| | |
| 6|BGRFC_WATCHDOG | 1183| 105| | |
| 7|AUTOTH | 1993| 55| | |
| 8|AUTOCCMS | 5912| 49| | |
| 9|AUTOSECURITY | 5912| 49| | |
| 10|LOAD_CALCULATION | 354249| 1| | |
| 11|SPOOLALRM | 5913| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 166 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:13:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:13:31:503 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:13:51:503 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:14:11:504 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 30614

Wed Sep 18 09:14:19:516 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:14:31:504 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 30614 terminated

Wed Sep 18 09:14:51:504 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:15:11:505 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-31521
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-31522

Wed Sep 18 09:15:15:994 2019


*** ERROR => DpHdlDeadWp: W1 (pid 31521) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31521) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31521)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 31522) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31522) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 31522)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:15:31:505 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:15:51:506 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:16:11:506 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:16:31:507 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:16:51:507 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:17:11:508 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:17:31:508 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:17:51:509 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:18:11:509 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:18:31:510 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:18:51:510 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:14:11 2019, skip new snapshot

Wed Sep 18 09:19:11:511 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 167 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:19:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:19:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:19:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 11001598, readCount 11001598)


UPD : 0 (peak 31, writeCount 2425, readCount 2425)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080795, readCount 1080795)
SPO : 0 (peak 2, writeCount 13028, readCount 13028)
UP2 : 0 (peak 1, writeCount 1244, readCount 1244)
DISP: 0 (peak 67, writeCount 423681, readCount 423681)
GW : 0 (peak 45, writeCount 10021580, readCount 10021580)
ICM : 0 (peak 186, writeCount 199645, readCount 199645)
LWP : 2 (peak 15, writeCount 19618, readCount 19616)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:19:11 2019


------------------------------------------------------------

Current snapshot id: 167


DB clean time (in percent of total time) : 24.59 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |201|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |201|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 09:19:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:19:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:19:11 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:19:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2958| 47| | |
| 1|DDLOG | 2958| 47| | |
| 2|BTCSCHED | 5918| 49| | |
| 3|RESTART_ALL | 1183| 45| | |
| 4|ENVCHECK | 17757| 20| | |
| 5|AUTOABAP | 1183| 45| | |
| 6|BGRFC_WATCHDOG | 1184| 45| | |
| 7|AUTOTH | 1999| 55| | |
| 8|AUTOCCMS | 5918| 49| | |
| 9|AUTOSECURITY | 5918| 49| | |
| 10|LOAD_CALCULATION | 354608| 1| | |
| 11|SPOOLALRM | 5919| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 167 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:19:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:19:31:512 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:19:51:512 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:20:11:513 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-1898
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-1899
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1900

Wed Sep 18 09:20:17:037 2019


*** ERROR => DpHdlDeadWp: W1 (pid 1898) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1898) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 1898)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 1899) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=1899) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 1899)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:20:23:195 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:20:31:514 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1900 terminated

Wed Sep 18 09:20:51:515 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:21:11:515 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:21:31:516 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:21:51:516 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:22:11:516 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:22:31:517 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:22:51:518 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:23:11:519 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:23:31:520 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:23:51:521 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:24:11:521 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:24:31:521 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:24:51:522 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:20:11 2019, skip new snapshot

Wed Sep 18 09:25:11:523 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 168 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:25:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:25:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:25:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2


Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 11002940, readCount 11002940)


UPD : 0 (peak 31, writeCount 2426, readCount 2426)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080799, readCount 1080799)
SPO : 0 (peak 2, writeCount 13041, readCount 13041)
UP2 : 0 (peak 1, writeCount 1245, readCount 1245)
DISP: 0 (peak 67, writeCount 423727, readCount 423727)
GW : 0 (peak 45, writeCount 10022615, readCount 10022615)
ICM : 0 (peak 186, writeCount 199648, readCount 199648)
LWP : 2 (peak 15, writeCount 19633, readCount 19631)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:25:11 2019


------------------------------------------------------------

Current snapshot id: 168


DB clean time (in percent of total time) : 24.59 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |202|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |202|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 09:25:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T101_U26380_M0 |000| |SST-LAP-LEN0043 |09:24:51|5 |SAPMSYST |high| | |SESSION_MA| 4202|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 5 logons with 5 sessions


Total ES (gross) memory of all sessions: 29 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:25:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:25:11 2019


------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:25:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2961| 47| | |
| 1|DDLOG | 2961| 47| | |
| 2|BTCSCHED | 5924| 49| | |
| 3|RESTART_ALL | 1185| 285| | |
| 4|ENVCHECK | 17775| 20| | |
| 5|AUTOABAP | 1185| 285| | |
| 6|BGRFC_WATCHDOG | 1186| 285| | |
| 7|AUTOTH | 2005| 55| | |
| 8|AUTOCCMS | 5924| 49| | |
| 9|AUTOSECURITY | 5924| 49| | |
| 10|LOAD_CALCULATION | 354966| 0| | |
| 11|SPOOLALRM | 5925| 49| | |
|

Found 12 periodic tasks

********** SERVER SNAPSHOT 168 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:25:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W1-5028
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:25:11:537 2019


DpWpDynCreate: created new work process W12-5029

Wed Sep 18 09:25:13:657 2019


*** ERROR => DpHdlDeadWp: W1 (pid 5028) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5028) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 5028)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 5029) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5029) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5029)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:25:31:523 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:25:51:524 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:26:11:524 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 5511

Wed Sep 18 09:26:24:380 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:26:31:525 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 5511 terminated

Wed Sep 18 09:26:51:526 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:27:11:526 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:27:31:527 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:27:51:527 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:28:11:528 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:28:31:528 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:28:51:528 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:29:11:530 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:29:31:530 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:29:51:531 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:30:11:532 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-8195
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-8196

Wed Sep 18 09:30:17:778 2019


*** ERROR => DpHdlDeadWp: W1 (pid 8195) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8195) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 8195)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 8196) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=8196) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 8196)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new snapshot

Wed Sep 18 09:30:31:532 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new
snapshot

Wed Sep 18 09:30:51:532 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new
snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:26:11 2019, skip new
snapshot

Wed Sep 18 09:31:11:533 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
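The cycle above repeats throughout this trace: the dispatcher dynamically creates W1 and W12 (DpWpDynCreate), both die within seconds with exit code 255, and the wp_adm slots sit empty until the next restart attempt. A short sketch for summarizing these death events from a dev_disp file; the line format is taken from this trace, and treating it as a stable parsing target is an assumption:

```python
import re
from collections import Counter

# Matches the DpHdlDeadWp error lines as they appear in this dev_disp trace.
DIED_RE = re.compile(
    r"\*\*\* ERROR => DpHdlDeadWp: (W\d+) \(pid (\d+)\) died "
    r"\(severity=(\d+), status=(\d+)\)"
)

def summarize_wp_deaths(lines):
    """Count deaths per work process slot and collect the dead pids, in order."""
    deaths = Counter()
    pids = []
    for line in lines:
        m = DIED_RE.search(line)
        if m:
            deaths[m.group(1)] += 1
            pids.append(int(m.group(2)))
    return deaths, pids

# Two lines copied from this trace (09:30:17):
sample = [
    "*** ERROR => DpHdlDeadWp: W1 (pid 8195) died (severity=0, status=65280) [dpxxwp.c 1463]",
    "*** ERROR => DpHdlDeadWp: W12 (pid 8196) died (severity=0, status=65280) [dpxxwp.c 1463]",
]
deaths, pids = summarize_wp_deaths(sample)
print(deaths["W1"], deaths["W12"], pids)  # 1 1 [8195, 8196]
```

Run against the full file, this makes the restart churn countable instead of eyeballing thousands of repeated lines.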

********** SERVER SNAPSHOT 169 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:31:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:31:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:31:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 11003816, readCount 11003816)
UPD : 0 (peak 31, writeCount 2427, readCount 2427)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080803, readCount 1080803)
SPO : 0 (peak 2, writeCount 13054, readCount 13054)
UP2 : 0 (peak 1, writeCount 1246, readCount 1246)
DISP: 0 (peak 67, writeCount 423771, readCount 423771)
GW : 0 (peak 45, writeCount 10023209, readCount 10023209)
ICM : 0 (peak 186, writeCount 199651, readCount 199651)
LWP : 0 (peak 15, writeCount 19648, readCount 19648)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:31:11 2019
------------------------------------------------------------
Current snapshot id: 169
DB clean time (in percent of total time) : 24.59 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |204|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 6|32125 |DIA |WP_RUN | | |norm|T61_U26867_M0 |HTTP_NORM| | | 0|<HANDLE PLUGIN> |000| | | |
| 12| |BTC |WP_KILL| |204|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 3 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 09:31:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T39_U26514_M0 |000| |SST-LAP-LEN0033 |09:26:22|5 |SAPMSYST |high| | |SESSION_MA| 4202|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|HTTP_NORMAL |T61_U26867_M0 |000| |10.54.36.33 |09:31:11|6 |SAPMHTTP |norm| | | | 4590|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 6 logons with 6 sessions
Total ES (gross) memory of all sessions: 33 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:31:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:31:11 2019
------------------------------------------------------------
Current pipes in use: 221
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:31:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2964| 47| | |
| 1|DDLOG | 2964| 47| | |
| 2|BTCSCHED | 5930| 49| | |
| 3|RESTART_ALL | 1186| 225| | |
| 4|ENVCHECK | 17793| 20| | |
| 5|AUTOABAP | 1186| 225| | |
| 6|BGRFC_WATCHDOG | 1187| 225| | |
| 7|AUTOTH | 2011| 55| | |
| 8|AUTOCCMS | 5930| 49| | |
| 9|AUTOSECURITY | 5930| 49| | |
| 10|LOAD_CALCULATION | 355325| 0| | |
| 11|SPOOLALRM | 5931| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 169 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:31:11 2019) - end **********
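Two details in the snapshot above are worth decoding. First, the Err column for W1 and W12 (204 here, incrementing in later snapshots) is the dispatcher's restart error counter for those slots. Second, the status=65280 reported when each process dies is a raw wait() status, and the dispatcher's own "exited with exit code 255" message follows from the usual POSIX encoding (exit code in the high byte). A quick arithmetic check:

```python
import os

# Raw wait() status from the DpHdlDeadWp error lines in this trace.
status = 65280

# POSIX encodes a normal exit's code in bits 8-15 of the status word.
exit_code = (status >> 8) & 0xFF
print(exit_code)  # 255

# Cross-check against the stdlib's own decoding (Linux semantics).
assert os.WIFEXITED(status)            # terminated normally, not by a signal
assert os.WEXITSTATUS(status) == exit_code
```

Exit code 255 with no signal means the work process itself aborted during startup, which is why the trace shows no core dump, only the repeating WP_KILL state.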

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:31:31:534 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:31:51:535 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:32:11:535 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 9023

Wed Sep 18 09:32:18:804 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:32:31:536 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 9023 terminated

Wed Sep 18 09:32:51:536 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:33:11:537 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:33:31:537 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:33:51:538 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:34:11:539 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:34:31:540 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:34:51:540 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:35:11:541 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-10765
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-10766

Wed Sep 18 09:35:14:488 2019
*** ERROR => DpHdlDeadWp: W1 (pid 10765) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10765) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 10765)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 10766) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=10766) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 10766)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:35:31:541 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:35:51:541 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:36:11:541 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:36:31:542 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:36:51:542 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:32:11 2019, skip new snapshot

Wed Sep 18 09:37:11:543 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 170 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:37:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:37:11 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:37:11 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 2
Max. number of queue elements : 14000
DIA : 0 (peak 291, writeCount 11004722, readCount 11004722)
UPD : 0 (peak 31, writeCount 2428, readCount 2428)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080807, readCount 1080807)
SPO : 0 (peak 2, writeCount 13067, readCount 13067)
UP2 : 0 (peak 1, writeCount 1247, readCount 1247)
DISP: 0 (peak 67, writeCount 423812, readCount 423812)
GW : 0 (peak 45, writeCount 10023801, readCount 10023801)
ICM : 0 (peak 186, writeCount 199654, readCount 199654)
LWP : 0 (peak 15, writeCount 19663, readCount 19663)

Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:
Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:37:11 2019
------------------------------------------------------------
Current snapshot id: 170
DB clean time (in percent of total time) : 24.60 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1| |DIA |WP_KILL| |205|norm|T138_U21626_M0 |HTTP_NORM| | | |CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| | |
| 12| |BTC |WP_KILL| |205|low |T105_U21576_M0 |BATCH | | | |SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD | |

Found 2 active workprocesses
Total number of workprocesses is 16

Session Table Wed Sep 18 09:37:11 2019
------------------------------------------------------------
|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T39_U26514_M0 |000| |SST-LAP-LEN0033 |09:26:22|5 |SAPMSYST |high| | |SESSION_MA| 4202|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |SMREP_PROCESS_BW_DATA_QUEUE |low | | | | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |SAPMHTTP |norm| | | | 4590|

Found 5 logons with 5 sessions
Total ES (gross) memory of all sessions: 29 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:37:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:37:11 2019
------------------------------------------------------------
Current pipes in use: 213
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:37:11 2019
------------------------------------------------------------
|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 2967| 47| | |
| 1|DDLOG | 2967| 47| | |
| 2|BTCSCHED | 5936| 49| | |
| 3|RESTART_ALL | 1187| 165| | |
| 4|ENVCHECK | 17811| 20| | |
| 5|AUTOABAP | 1187| 165| | |
| 6|BGRFC_WATCHDOG | 1188| 165| | |
| 7|AUTOTH | 2017| 55| | |
| 8|AUTOCCMS | 5936| 49| | |
| 9|AUTOSECURITY | 5936| 49| | |
| 10|LOAD_CALCULATION | 355683| 0| | |
| 11|SPOOLALRM | 5937| 49| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 170 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:37:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:37:31:543 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:37:51:544 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:38:11:544 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 12227

Wed Sep 18 09:38:20:369 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:38:31:544 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 12227 terminated

Wed Sep 18 09:38:51:545 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:39:11:545 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:39:31:546 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:39:51:546 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:40:11:546 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-13490
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-13491

Wed Sep 18 09:40:13:516 2019
*** ERROR => DpHdlDeadWp: W1 (pid 13490) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13490) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 13490)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W12 (pid 13491) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13491) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13491)
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:40:31:547 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:40:51:548 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:41:11:548 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:41:31:548 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:41:51:548 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:42:11:548 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:42:31:549 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:42:51:549 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:38:11 2019, skip new snapshot

Wed Sep 18 09:43:11:549 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 171 (Reason: Workprocess 1 died / Time: Wed Sep 18
09:43:11 2019) - begin **********

Server smprd02_SMP_00, Wed Sep 18 09:43:11 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 7, standby_wps 0
#dia = 7
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Wed Sep 18 09:43:11 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 11005596, readCount 11005596)


UPD : 0 (peak 31, writeCount 2430, readCount 2430)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1080815, readCount 1080815)
SPO : 0 (peak 2, writeCount 13081, readCount 13081)
UP2 : 0 (peak 1, writeCount 1249, readCount 1249)
DISP: 0 (peak 67, writeCount 423853, readCount 423853)
GW : 0 (peak 45, writeCount 10024371, readCount 10024371)
ICM : 0 (peak 186, writeCount 199659, readCount 199659)
LWP : 2 (peak 15, writeCount 19693, readCount 19691)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W1> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Wed Sep 18 09:43:11 2019


------------------------------------------------------------

Current snapshot id: 171


DB clean time (in percent of total time) : 24.60 %
Number of preemptions : 78

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|


Time |Program |Cli|User |Action
|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|---
--|----------------------------------------|---|------------|--------------------|-
-------------------|
| 1| |DIA |WP_KILL| |206|norm|T138_U21626_M0 |HTTP_NORM| | |
|CL_HTTP_SERVER_NET============CP |001|SM_EXTERN_WS| |
|
| 12| |BTC |WP_KILL| |206|low |T105_U21576_M0 |BATCH | | |
|SAPLSCSM_BI_DATALOAD |001|SM_EFWK |REPLOAD |
|

Found 2 active workprocesses


Total number of workprocesses is 16

Session Table Wed Sep 18 09:43:11 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |


Program |Prio|Tasks|Application-Info
|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|-
---------------------------------------|----|-----|--------------------------------
------------------|----------|----------|
|GUI |T14_U21459_M0 |000| |SST-LAP-LEN0033 |08:23:51|0 |
SAPMSYST |high| |
|SESSION_MA| 4233|
|GUI |T39_U26514_M0 |000| |SST-LAP-LEN0033 |09:26:22|5 |
SAPMSYST |high| |
|SESSION_MA| 4202|
|GUI |T54_U23819_M0 |000| |SST-LAP-LEN0028 |08:56:40|5 |
SAPMSYST |high| |
|SESSION_MA| 4233|
|BATCH |T105_U21576_M0 |001|SM_EFWK | |16:43:03|12 |
SMREP_PROCESS_BW_DATA_QUEUE |low | |
| | 12439|
|HTTP_NORMAL |T138_U21626_M0 |001|SM_EXTERN_WS|10.54.36.37 |16:43:09|1 |
SAPMHTTP |norm| |
| | 4590|
Found 5 logons with 5 sessions
Total ES (gross) memory of all sessions: 29 MB
Most ES (gross) memory allocated by T105_U21576_M0: 12 MB

Communication Table is empty Wed Sep 18 09:43:11 2019

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Wed Sep 18 09:43:11 2019


------------------------------------------------------------
Current pipes in use: 219
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Wed Sep 18 09:43:11 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID


|
|--------|--------------------|----------|----------|--------------------|---------
-|
| 0|BUFREF | 2970| 47| |
|
| 1|DDLOG | 2970| 47| |
|
| 2|BTCSCHED | 5942| 49| |
|
| 3|RESTART_ALL | 1188| 105| |
|
| 4|ENVCHECK | 17829| 20| |
|
| 5|AUTOABAP | 1188| 105| |
|
| 6|BGRFC_WATCHDOG | 1189| 105| |
|
| 7|AUTOTH | 2023| 55| |
|
| 8|AUTOCCMS | 5942| 49| |
|
| 9|AUTOSECURITY | 5942| 49| |
|
| 10|LOAD_CALCULATION | 356042| 0| |
|
| 11|SPOOLALRM | 5943| 49| |
|

Found 12 periodic tasks

********** SERVER SNAPSHOT 171 (Reason: Workprocess 1 died / Time: Wed Sep 18 09:43:11 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Wed Sep 18 09:43:31:549 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:43:51:550 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Wed Sep 18 09:44:11:551 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22454

Wed Sep 18 09:44:18:357 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot

Wed Sep 18 09:44:31:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22454 terminated

Wed Sep 18 09:44:51:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot

Wed Sep 18 09:45:00:841 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 7->8, VB 1->1, ENQ 0->0, BTC 4->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5

Wed Sep 18 09:45:11:552 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W1-23287
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Wed Sep 18 09:44:11 2019, skip new snapshot
DpWpDynCreate: created new work process W12-23288
DpWpDynCreate: created new work process W12-23288
Wed Sep 18 09:46:37:908 2019
DpWpDynCreate: created new work process W17-26402

Wed Sep 18 09:48:55:111 2019


DpHdlDeadWp: W10 (pid=5557) terminated automatically
DpWpDynCreate: created new work process W10-598

Wed Sep 18 09:49:56:563 2019


DpHdlDeadWp: W13 (pid=4288) terminated automatically
DpWpDynCreate: created new work process W13-3359

Wed Sep 18 09:51:51:567 2019


DpWpCheck: dyn W17, pid 26402 no longer needed, terminate now

Wed Sep 18 09:51:51:683 2019


DpHdlDeadWp: W17 (pid=26402) terminated automatically

Wed Sep 18 09:52:31:300 2019


DpWpDynCreate: created new work process W19-5285

Wed Sep 18 09:52:57:027 2019


DpHdlDeadWp: W12 (pid=23288) terminated automatically
DpWpDynCreate: created new work process W12-5315

Wed Sep 18 09:54:26:149 2019


DpHdlDeadWp: W11 (pid=18210) terminated automatically
DpWpDynCreate: created new work process W11-5762

Wed Sep 18 09:57:51:578 2019


DpWpCheck: dyn W19, pid 5285 no longer needed, terminate now

Wed Sep 18 09:57:52:403 2019


DpHdlDeadWp: W19 (pid=5285) terminated automatically

Wed Sep 18 10:00:38:014 2019


DpWpDynCreate: created new work process W20-8204

Wed Sep 18 10:03:42:777 2019


DpHdlDeadWp: W12 (pid=5315) terminated automatically
DpWpDynCreate: created new work process W12-17024

Wed Sep 18 10:03:43:853 2019


DpHdlDeadWp: W11 (pid=5762) terminated automatically
DpWpDynCreate: created new work process W11-17067

Wed Sep 18 10:05:51:590 2019


DpWpCheck: dyn W20, pid 8204 no longer needed, terminate now

Wed Sep 18 10:05:51:973 2019


DpHdlDeadWp: W20 (pid=8204) terminated automatically

Wed Sep 18 10:12:13:278 2019


DpWpDynCreate: created new work process W7-8247

Wed Sep 18 10:12:13:714 2019


DpHdlDeadWp: W12 (pid=17024) terminated automatically
DpWpDynCreate: created new work process W12-8254
Wed Sep 18 10:12:43:758 2019
DpHdlDeadWp: W10 (pid=598) terminated automatically
DpWpDynCreate: created new work process W10-8478

Wed Sep 18 10:14:55:157 2019


DpWpDynCreate: created new work process W18-9140

Wed Sep 18 10:17:31:608 2019


DpWpCheck: dyn W7, pid 8247 no longer needed, terminate now

Wed Sep 18 10:17:32:198 2019


DpHdlDeadWp: W7 (pid=8247) terminated automatically

Wed Sep 18 10:18:51:611 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T54_U23819 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T54_U23819_M0   |000|            |SST-LAP-LEN0028     |08:56:40|5  |SAPMSYST                                |high|     |                                                  |SESSION_MA|
DpHdlSoftCancel: cancel request for T54_U23819_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Wed Sep 18 10:19:56:647 2019


DpHdlDeadWp: W18 (pid=9140) terminated automatically

Wed Sep 18 10:22:15:659 2019


DpWpDynCreate: created new work process W17-11725

Wed Sep 18 10:23:18:159 2019


DpWpDynCreate: created new work process W19-12108

Wed Sep 18 10:27:31:626 2019


DpWpCheck: dyn W17, pid 11725 no longer needed, terminate now

Wed Sep 18 10:27:32:363 2019


DpHdlDeadWp: W17 (pid=11725) terminated automatically

Wed Sep 18 10:28:31:629 2019


DpWpCheck: dyn W19, pid 12108 no longer needed, terminate now

Wed Sep 18 10:28:32:420 2019


DpHdlDeadWp: W19 (pid=12108) terminated automatically

Wed Sep 18 10:29:11:793 2019


DpWpDynCreate: created new work process W20-14078
DpWpDynCreate: created new work process W7-14079

Wed Sep 18 10:34:31:639 2019


DpWpCheck: dyn W7, pid 14079 no longer needed, terminate now
DpWpCheck: dyn W20, pid 14078 no longer needed, terminate now

Wed Sep 18 10:34:32:728 2019


DpHdlDeadWp: W7 (pid=14079) terminated automatically
DpHdlDeadWp: W20 (pid=14078) terminated automatically

Wed Sep 18 10:36:07:933 2019


DpWpDynCreate: created new work process W18-16376

Wed Sep 18 10:36:19:179 2019


DpWpDynCreate: created new work process W17-16469

Wed Sep 18 10:41:11:649 2019


DpWpCheck: dyn W18, pid 16376 no longer needed, terminate now

Wed Sep 18 10:41:12:121 2019


DpHdlDeadWp: W18 (pid=16376) terminated automatically

Wed Sep 18 10:41:24:483 2019


DpHdlDeadWp: W17 (pid=16469) terminated automatically

Wed Sep 18 10:46:02:144 2019


DpWpDynCreate: created new work process W19-19769

Wed Sep 18 10:51:04:261 2019


DpHdlDeadWp: W19 (pid=19769) terminated automatically

Wed Sep 18 10:51:18:265 2019


DpWpDynCreate: created new work process W7-21825

Wed Sep 18 10:56:19:541 2019


DpHdlDeadWp: W7 (pid=21825) terminated automatically

Wed Sep 18 10:58:06:843 2019


DpWpDynCreate: created new work process W20-24374

Wed Sep 18 11:01:02:601 2019


DpWpDynCreate: created new work process W18-25623

Wed Sep 18 11:03:08:542 2019


DpHdlDeadWp: W20 (pid=24374) terminated automatically

Wed Sep 18 11:06:04:367 2019


DpHdlDeadWp: W18 (pid=25623) terminated automatically

Wed Sep 18 11:09:48:379 2019


DpWpDynCreate: created new work process W17-23821

Wed Sep 18 11:11:44:185 2019


DpWpDynCreate: created new work process W19-24803

Wed Sep 18 11:14:51:709 2019


DpWpCheck: dyn W17, pid 23821 no longer needed, terminate now

Wed Sep 18 11:14:52:033 2019


DpHdlDeadWp: W17 (pid=23821) terminated automatically

Wed Sep 18 11:16:45:630 2019


DpHdlDeadWp: W19 (pid=24803) terminated automatically

Wed Sep 18 11:25:03:069 2019


DpWpDynCreate: created new work process W7-30104

Wed Sep 18 11:30:04:700 2019


DpHdlDeadWp: W7 (pid=30104) terminated automatically

Wed Sep 18 11:34:12:770 2019


DpWpDynCreate: created new work process W20-1095
Wed Sep 18 11:39:23:259 2019
DpHdlDeadWp: W20 (pid=1095) terminated automatically

Wed Sep 18 11:40:01:674 2019


DpWpDynCreate: created new work process W18-3375

Wed Sep 18 11:45:02:949 2019


DpHdlDeadWp: W18 (pid=3375) terminated automatically
DpWpDynCreate: created new work process W17-5480

Wed Sep 18 11:50:03:685 2019


DpHdlDeadWp: W17 (pid=5480) terminated automatically

Wed Sep 18 11:52:27:581 2019


DpWpDynCreate: created new work process W19-8317

Wed Sep 18 11:57:31:781 2019


DpWpCheck: dyn W19, pid 8317 no longer needed, terminate now

Wed Sep 18 11:57:32:121 2019


DpHdlDeadWp: W19 (pid=8317) terminated automatically

Wed Sep 18 12:00:59:662 2019


DpHdlDeadWp: W10 (pid=8478) terminated automatically
DpWpDynCreate: created new work process W10-11642

Wed Sep 18 12:03:06:648 2019


DpWpDynCreate: created new work process W7-15128

Wed Sep 18 12:08:07:694 2019


DpHdlDeadWp: W7 (pid=15128) terminated automatically

Wed Sep 18 12:13:03:701 2019


DpWpDynCreate: created new work process W20-11597

Wed Sep 18 12:13:04:189 2019


DpWpDynCreate: created new work process W18-11598

Wed Sep 18 12:18:04:434 2019


DpHdlDeadWp: W20 (pid=11597) terminated automatically

Wed Sep 18 12:18:05:683 2019


DpHdlDeadWp: W18 (pid=11598) terminated automatically

Wed Sep 18 12:24:51:827 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T93_U933 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T93_U933_M0     |001|EXT_SCHAITAN|SST-LAP-HP0055      |10:01:35|3  |SAPLSBCS_OUT                            |high|     |                                                  |SOST      |
DpHdlSoftCancel: cancel request for T93_U933_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Wed Sep 18 12:25:04:863 2019


DpWpDynCreate: created new work process W17-16057

Wed Sep 18 12:30:05:477 2019


DpHdlDeadWp: W17 (pid=16057) terminated automatically
Wed Sep 18 12:31:07:735 2019
DpWpDynCreate: created new work process W19-18257

Wed Sep 18 12:33:10:150 2019


DpWpDynCreate: created new work process W7-18939

Wed Sep 18 12:33:45:402 2019


DpWpDynCreate: created new work process W20-19175

Wed Sep 18 12:36:11:846 2019


DpWpCheck: dyn W19, pid 18257 no longer needed, terminate now

Wed Sep 18 12:36:12:752 2019


DpHdlDeadWp: W19 (pid=18257) terminated automatically

Wed Sep 18 12:38:11:456 2019


DpHdlDeadWp: W7 (pid=18939) terminated automatically

Wed Sep 18 12:38:46:537 2019


DpHdlDeadWp: W20 (pid=19175) terminated automatically

Wed Sep 18 12:43:02:073 2019


DpWpDynCreate: created new work process W18-22476

Wed Sep 18 12:48:04:201 2019


DpHdlDeadWp: W18 (pid=22476) terminated automatically

Wed Sep 18 12:49:11:868 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T11_U216 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T11_U216_M0     |001|EXT_RKUDUMUL|SST-LAP-LEN0028     |11:47:21|4  |SAPLSMTR_NAVIGATION                     |high|     |                                                  |          |
DpHdlSoftCancel: cancel request for T11_U216_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Wed Sep 18 12:56:01:432 2019


DpWpDynCreate: created new work process W17-26949

Wed Sep 18 13:01:03:353 2019


DpHdlDeadWp: W17 (pid=26949) terminated automatically

Wed Sep 18 13:02:44:110 2019


DpHdlDeadWp: W9 (pid=25625) terminated automatically
DpWpDynCreate: created new work process W9-629

Wed Sep 18 13:03:31:294 2019


DpWpDynCreate: created new work process W19-3527

Wed Sep 18 13:03:49:994 2019


DpWpDynCreate: created new work process W7-4789

Wed Sep 18 13:08:51:897 2019


DpWpCheck: dyn W7, pid 4789 no longer needed, terminate now
DpWpCheck: dyn W19, pid 3527 no longer needed, terminate now

Wed Sep 18 13:08:52:742 2019


DpHdlDeadWp: W7 (pid=4789) terminated automatically
DpHdlDeadWp: W19 (pid=3527) terminated automatically
Wed Sep 18 13:14:08:737 2019
DpWpDynCreate: created new work process W20-28843

Wed Sep 18 13:18:26:285 2019


DpWpDynCreate: created new work process W18-30480

Wed Sep 18 13:19:11:911 2019


DpWpCheck: dyn W20, pid 28843 no longer needed, terminate now

Wed Sep 18 13:19:12:246 2019


DpHdlDeadWp: W20 (pid=28843) terminated automatically

Wed Sep 18 13:23:31:919 2019


DpWpCheck: dyn W18, pid 30480 no longer needed, terminate now

Wed Sep 18 13:23:32:535 2019


DpHdlDeadWp: W18 (pid=30480) terminated automatically

Wed Sep 18 13:24:09:732 2019


DpWpDynCreate: created new work process W17-32378

Wed Sep 18 13:25:07:808 2019


DpWpDynCreate: created new work process W7-32717

Wed Sep 18 13:27:05:469 2019


DpWpDynCreate: created new work process W19-885

Wed Sep 18 13:27:05:609 2019


DpWpDynCreate: created new work process W20-886

Wed Sep 18 13:29:11:928 2019


DpWpCheck: dyn W17, pid 32378 no longer needed, terminate now

Wed Sep 18 13:29:12:815 2019


DpHdlDeadWp: W17 (pid=32378) terminated automatically

Wed Sep 18 13:30:08:625 2019


DpHdlDeadWp: W7 (pid=32717) terminated automatically

Wed Sep 18 13:32:06:970 2019


DpWpCheck: dyn W19, pid 885 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=886) terminated automatically

Wed Sep 18 13:32:07:112 2019


DpHdlDeadWp: W19 (pid=885) terminated automatically

Wed Sep 18 13:33:24:815 2019


DpWpDynCreate: created new work process W18-3342

Wed Sep 18 13:33:36:883 2019


DpWpDynCreate: created new work process W17-3430

Wed Sep 18 13:36:08:901 2019


DpWpDynCreate: created new work process W7-4151

Wed Sep 18 13:38:23:545 2019


DpWpDynCreate: created new work process W20-5176
Wed Sep 18 13:38:31:944 2019
DpWpCheck: dyn W18, pid 3342 no longer needed, terminate now

Wed Sep 18 13:38:32:343 2019


DpHdlDeadWp: W18 (pid=3342) terminated automatically

Wed Sep 18 13:38:51:944 2019


DpWpCheck: dyn W17, pid 3430 no longer needed, terminate now

Wed Sep 18 13:38:52:452 2019


DpHdlDeadWp: W17 (pid=3430) terminated automatically

Wed Sep 18 13:39:07:281 2019


DpWpDynCreate: created new work process W19-5294

Wed Sep 18 13:41:09:626 2019


DpHdlDeadWp: W7 (pid=4151) terminated automatically

Wed Sep 18 13:43:31:954 2019


DpWpCheck: dyn W20, pid 5176 no longer needed, terminate now

Wed Sep 18 13:43:32:740 2019


DpHdlDeadWp: W20 (pid=5176) terminated automatically

Wed Sep 18 13:44:05:231 2019


DpWpDynCreate: created new work process W18-7012

Wed Sep 18 13:44:11:955 2019


DpWpCheck: dyn W19, pid 5294 no longer needed, terminate now

Wed Sep 18 13:44:12:866 2019


DpHdlDeadWp: W19 (pid=5294) terminated automatically

Wed Sep 18 13:49:11:962 2019


DpWpCheck: dyn W18, pid 7012 no longer needed, terminate now

Wed Sep 18 13:49:12:289 2019


DpHdlDeadWp: W18 (pid=7012) terminated automatically

Wed Sep 18 13:50:11:298 2019


DpWpDynCreate: created new work process W17-9205

Wed Sep 18 13:55:31:977 2019


DpWpCheck: dyn W17, pid 9205 no longer needed, terminate now

Wed Sep 18 13:55:32:603 2019


DpHdlDeadWp: W17 (pid=9205) terminated automatically

Wed Sep 18 13:57:35:503 2019


DpHdlDeadWp: W10 (pid=11642) terminated automatically
DpWpDynCreate: created new work process W10-11843

Wed Sep 18 14:02:12:370 2019


DpHdlDeadWp: W10 (pid=11843) terminated automatically
DpWpDynCreate: created new work process W10-15938

Wed Sep 18 14:04:22:319 2019


DpWpDynCreate: created new work process W7-23511
Wed Sep 18 14:08:34:318 2019
DpWpDynCreate: created new work process W20-3497

Wed Sep 18 14:09:32:382 2019


DpWpCheck: dyn W7, pid 23511 no longer needed, terminate now

Wed Sep 18 14:09:32:848 2019


DpHdlDeadWp: W7 (pid=23511) terminated automatically

Wed Sep 18 14:13:36:789 2019


DpHdlDeadWp: W20 (pid=3497) terminated automatically

Wed Sep 18 14:18:45:332 2019


DpWpDynCreate: created new work process W19-14484

Wed Sep 18 14:23:52:407 2019


DpWpCheck: dyn W19, pid 14484 no longer needed, terminate now

Wed Sep 18 14:23:53:131 2019


DpHdlDeadWp: W19 (pid=14484) terminated automatically

Wed Sep 18 14:26:04:382 2019


DpWpDynCreate: created new work process W18-16810

Wed Sep 18 14:29:16:606 2019


DpWpDynCreate: created new work process W17-17909

Wed Sep 18 14:31:06:741 2019


DpHdlDeadWp: W18 (pid=16810) terminated automatically

Wed Sep 18 14:34:17:868 2019


DpWpDynCreate: created new work process W7-19326
DpWpDynCreate: created new work process W20-19327

Wed Sep 18 14:34:32:425 2019


DpWpCheck: dyn W17, pid 17909 no longer needed, terminate now

Wed Sep 18 14:34:32:895 2019


DpHdlDeadWp: W17 (pid=17909) terminated automatically

Wed Sep 18 14:39:18:880 2019


DpHdlDeadWp: W7 (pid=19326) terminated automatically
DpWpCheck: dyn W20, pid 19327 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=19327) terminated automatically

Wed Sep 18 14:43:02:061 2019


DpWpDynCreate: created new work process W19-22666

Wed Sep 18 14:48:07:939 2019


DpHdlDeadWp: W19 (pid=22666) terminated automatically

Wed Sep 18 14:48:42:853 2019


DpWpDynCreate: created new work process W18-24999

Wed Sep 18 14:53:52:456 2019


DpWpCheck: dyn W18, pid 24999 no longer needed, terminate now

Wed Sep 18 14:53:53:117 2019


DpHdlDeadWp: W18 (pid=24999) terminated automatically
Wed Sep 18 14:59:01:962 2019
DpWpDynCreate: created new work process W17-28282

Wed Sep 18 15:00:32:230 2019


DpHdlDeadWp: W9 (pid=629) terminated automatically
DpWpDynCreate: created new work process W9-28886

Wed Sep 18 15:04:02:596 2019


DpHdlDeadWp: W17 (pid=28282) terminated automatically

Wed Sep 18 15:05:40:834 2019


DpWpDynCreate: created new work process W7-13009

Wed Sep 18 15:07:03:821 2019


DpWpDynCreate: created new work process W20-16847

Wed Sep 18 15:10:52:489 2019


DpWpCheck: dyn W7, pid 13009 no longer needed, terminate now

Wed Sep 18 15:10:53:213 2019


DpHdlDeadWp: W7 (pid=13009) terminated automatically

Wed Sep 18 15:12:05:634 2019


DpHdlDeadWp: W20 (pid=16847) terminated automatically

Wed Sep 18 15:19:13:777 2019


DpWpDynCreate: created new work process W19-30553

Wed Sep 18 15:24:09:797 2019


DpWpDynCreate: created new work process W18-32406

Wed Sep 18 15:24:14:561 2019


DpHdlDeadWp: W19 (pid=30553) terminated automatically

Wed Sep 18 15:29:12:520 2019


DpWpCheck: dyn W18, pid 32406 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=32406) terminated automatically

Wed Sep 18 15:31:02:976 2019


DpWpDynCreate: created new work process W17-2162

Wed Sep 18 15:31:03:065 2019


DpWpDynCreate: created new work process W7-2166

Wed Sep 18 15:31:09:780 2019


DpWpDynCreate: created new work process W20-2202

Wed Sep 18 15:36:04:183 2019


DpWpCheck: dyn W7, pid 2166 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=2162) terminated automatically

Wed Sep 18 15:36:05:000 2019


DpHdlDeadWp: W7 (pid=2166) terminated automatically

Wed Sep 18 15:36:05:597 2019


DpWpDynCreate: created new work process W19-3957

Wed Sep 18 15:36:12:532 2019


DpWpCheck: dyn W20, pid 2202 no longer needed, terminate now

Wed Sep 18 15:36:13:275 2019


DpHdlDeadWp: W20 (pid=2202) terminated automatically

Wed Sep 18 15:41:07:267 2019


DpHdlDeadWp: W19 (pid=3957) terminated automatically

Wed Sep 18 15:41:07:938 2019


DpWpDynCreate: created new work process W18-5692

Wed Sep 18 15:43:06:814 2019


DpWpDynCreate: created new work process W17-6580

Wed Sep 18 15:46:12:550 2019


DpWpCheck: dyn W18, pid 5692 no longer needed, terminate now

Wed Sep 18 15:46:13:028 2019


DpHdlDeadWp: W18 (pid=5692) terminated automatically

Wed Sep 18 15:48:07:165 2019


DpHdlDeadWp: W17 (pid=6580) terminated automatically

Wed Sep 18 15:49:13:933 2019


DpWpDynCreate: created new work process W7-8490
DpWpDynCreate: created new work process W20-8491

Wed Sep 18 15:54:14:484 2019


DpHdlDeadWp: W7 (pid=8490) terminated automatically
DpWpCheck: dyn W20, pid 8491 no longer needed, terminate now

Wed Sep 18 15:54:15:046 2019


DpHdlDeadWp: W20 (pid=8491) terminated automatically

Wed Sep 18 15:59:14:611 2019


DpWpDynCreate: created new work process W19-11724

Wed Sep 18 16:00:36:912 2019


DpHdlDeadWp: W13 (pid=3359) terminated automatically
DpWpDynCreate: created new work process W13-12411

Wed Sep 18 16:04:16:606 2019


DpHdlDeadWp: W19 (pid=11724) terminated automatically

Wed Sep 18 16:05:38:662 2019


DpWpDynCreate: created new work process W18-29100

Wed Sep 18 16:11:38:826 2019


DpHdlDeadWp: W18 (pid=29100) terminated automatically

Wed Sep 18 16:19:21:430 2019


DpWpDynCreate: created new work process W17-14424

Wed Sep 18 16:19:21:840 2019


DpWpDynCreate: created new work process W7-14425

Wed Sep 18 16:22:06:584 2019


DpWpDynCreate: created new work process W20-15406
Wed Sep 18 16:24:22:428 2019
DpWpCheck: dyn W7, pid 14425 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=14424) terminated automatically
DpWpDynCreate: created new work process W19-16157

Wed Sep 18 16:24:22:547 2019


DpHdlDeadWp: W7 (pid=14425) terminated automatically

Wed Sep 18 16:26:02:922 2019


DpWpDynCreate: created new work process W18-16602

Wed Sep 18 16:27:12:618 2019


DpWpCheck: dyn W20, pid 15406 no longer needed, terminate now

Wed Sep 18 16:27:12:875 2019


DpHdlDeadWp: W20 (pid=15406) terminated automatically

Wed Sep 18 16:29:04:037 2019


DpWpDynCreate: created new work process W17-17475

Wed Sep 18 16:29:25:089 2019


DpHdlDeadWp: W19 (pid=16157) terminated automatically

Wed Sep 18 16:31:03:967 2019


DpHdlDeadWp: W18 (pid=16602) terminated automatically

Wed Sep 18 16:34:08:879 2019


DpHdlDeadWp: W17 (pid=17475) terminated automatically

Wed Sep 18 16:37:06:873 2019


DpWpDynCreate: created new work process W7-20123

Wed Sep 18 16:38:05:166 2019


DpWpDynCreate: created new work process W20-20386

Wed Sep 18 16:42:12:640 2019


DpWpCheck: dyn W7, pid 20123 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=20123) terminated automatically

Wed Sep 18 16:43:06:718 2019


DpHdlDeadWp: W20 (pid=20386) terminated automatically

Wed Sep 18 16:49:13:901 2019


DpWpDynCreate: created new work process W19-24790

Wed Sep 18 16:52:05:636 2019


DpWpDynCreate: created new work process W18-25841

Wed Sep 18 16:54:09:762 2019


DpWpDynCreate: created new work process W17-26452

Wed Sep 18 16:54:32:655 2019


DpWpCheck: dyn W19, pid 24790 no longer needed, terminate now

Wed Sep 18 16:54:33:175 2019


DpHdlDeadWp: W19 (pid=24790) terminated automatically

Wed Sep 18 16:57:12:659 2019


DpWpCheck: dyn W18, pid 25841 no longer needed, terminate now
Wed Sep 18 16:57:13:392 2019
DpHdlDeadWp: W18 (pid=25841) terminated automatically

Wed Sep 18 16:59:10:280 2019


DpHdlDeadWp: W17 (pid=26452) terminated automatically

Wed Sep 18 17:00:04:412 2019


DpWpDynCreate: created new work process W7-28417

Wed Sep 18 17:05:06:172 2019


DpHdlDeadWp: W7 (pid=28417) terminated automatically

Wed Sep 18 17:09:11:225 2019


DpWpDynCreate: created new work process W20-26945

Wed Sep 18 17:14:12:679 2019


DpHdlDeadWp: W20 (pid=26945) terminated automatically

Wed Sep 18 17:19:19:814 2019


DpWpDynCreate: created new work process W19-30286

Wed Sep 18 17:20:58:134 2019


DpHdlDeadWp: W11 (pid=17067) terminated automatically
DpWpDynCreate: created new work process W11-30853

Wed Sep 18 17:21:27:083 2019


DpWpDynCreate: created new work process W18-31023

Wed Sep 18 17:22:34:345 2019


DpHdlDeadWp: W10 (pid=15938) terminated automatically
DpWpDynCreate: created new work process W10-31385

Wed Sep 18 17:24:20:719 2019


DpWpDynCreate: created new work process W17-31957

Wed Sep 18 17:24:21:081 2019


DpHdlDeadWp: W19 (pid=30286) terminated automatically

Wed Sep 18 17:26:32:727 2019


DpWpCheck: dyn W18, pid 31023 no longer needed, terminate now

Wed Sep 18 17:26:33:537 2019


DpHdlDeadWp: W18 (pid=31023) terminated automatically

Wed Sep 18 17:29:13:647 2019


DpWpDynCreate: created new work process W7-1218

Wed Sep 18 17:29:22:038 2019


DpHdlDeadWp: W17 (pid=31957) terminated automatically

Wed Sep 18 17:34:14:848 2019


DpWpDynCreate: created new work process W20-2929

Wed Sep 18 17:34:32:740 2019


DpWpCheck: dyn W7, pid 1218 no longer needed, terminate now

Wed Sep 18 17:34:33:059 2019


DpHdlDeadWp: W7 (pid=1218) terminated automatically
Wed Sep 18 17:35:13:952 2019
DpWpDynCreate: created new work process W19-3264

Wed Sep 18 17:38:34:104 2019


DpWpDynCreate: created new work process W18-4611

Wed Sep 18 17:39:32:750 2019


DpWpCheck: dyn W20, pid 2929 no longer needed, terminate now

Wed Sep 18 17:39:33:091 2019


DpHdlDeadWp: W20 (pid=2929) terminated automatically

Wed Sep 18 17:40:32:752 2019


DpWpCheck: dyn W19, pid 3264 no longer needed, terminate now

Wed Sep 18 17:40:33:622 2019


DpHdlDeadWp: W19 (pid=3264) terminated automatically

Wed Sep 18 17:43:49:787 2019


DpHdlDeadWp: W18 (pid=4611) terminated automatically

Wed Sep 18 17:52:03:709 2019


DpWpDynCreate: created new work process W17-8990

Wed Sep 18 17:57:08:430 2019


DpHdlDeadWp: W17 (pid=8990) terminated automatically

Wed Sep 18 17:58:02:016 2019


DpWpDynCreate: created new work process W7-11088

Wed Sep 18 17:59:02:271 2019


DpWpDynCreate: created new work process W20-11353

Wed Sep 18 18:03:03:677 2019


DpHdlDeadWp: W7 (pid=11088) terminated automatically

Wed Sep 18 18:03:29:312 2019


DpWpDynCreate: created new work process W19-22797

Wed Sep 18 18:03:53:292 2019


DpWpDynCreate: created new work process W18-24716

Wed Sep 18 18:04:12:787 2019


DpWpCheck: dyn W20, pid 11353 no longer needed, terminate now

Wed Sep 18 18:04:13:241 2019


DpHdlDeadWp: W20 (pid=11353) terminated automatically

Wed Sep 18 18:08:32:794 2019


DpWpCheck: dyn W19, pid 22797 no longer needed, terminate now

Wed Sep 18 18:08:32:986 2019


DpHdlDeadWp: W19 (pid=22797) terminated automatically

Wed Sep 18 18:09:02:166 2019


DpHdlDeadWp: W18 (pid=24716) terminated automatically

Wed Sep 18 18:09:02:850 2019


DpWpDynCreate: created new work process W17-10205

Wed Sep 18 18:14:04:339 2019


DpHdlDeadWp: W17 (pid=10205) terminated automatically

Wed Sep 18 18:19:12:568 2019


DpWpDynCreate: created new work process W7-13552

Wed Sep 18 18:19:13:019 2019


DpWpDynCreate: created new work process W20-13553

Wed Sep 18 18:24:02:335 2019


DpWpDynCreate: created new work process W19-14978

Wed Sep 18 18:24:32:819 2019


DpWpCheck: dyn W7, pid 13552 no longer needed, terminate now
DpWpCheck: dyn W20, pid 13553 no longer needed, terminate now

Wed Sep 18 18:24:33:162 2019


DpHdlDeadWp: W7 (pid=13552) terminated automatically
DpHdlDeadWp: W20 (pid=13553) terminated automatically

Wed Sep 18 18:29:03:464 2019


DpHdlDeadWp: W19 (pid=14978) terminated automatically

Wed Sep 18 18:34:02:383 2019


DpWpDynCreate: created new work process W18-18273

Wed Sep 18 18:34:10:086 2019


DpWpDynCreate: created new work process W17-18368

Wed Sep 18 18:35:11:051 2019


DpWpDynCreate: created new work process W7-18711

Wed Sep 18 18:35:11:479 2019


DpWpDynCreate: created new work process W20-18712

Wed Sep 18 18:39:03:404 2019


DpHdlDeadWp: W18 (pid=18273) terminated automatically

Wed Sep 18 18:39:12:841 2019


DpWpCheck: dyn W17, pid 18368 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=18368) terminated automatically

Wed Sep 18 18:40:07:563 2019


DpWpDynCreate: created new work process W19-20121

Wed Sep 18 18:40:12:843 2019


DpWpCheck: dyn W7, pid 18711 no longer needed, terminate now
DpWpCheck: dyn W20, pid 18712 no longer needed, terminate now

Wed Sep 18 18:40:13:080 2019


DpHdlDeadWp: W7 (pid=18711) terminated automatically
DpHdlDeadWp: W20 (pid=18712) terminated automatically

Wed Sep 18 18:43:50:817 2019


DpWpDynCreate: created new work process W18-21818

Wed Sep 18 18:45:12:851 2019


DpWpCheck: dyn W19, pid 20121 no longer needed, terminate now

Wed Sep 18 18:45:13:318 2019


DpHdlDeadWp: W19 (pid=20121) terminated automatically

Wed Sep 18 18:48:52:858 2019


DpWpCheck: dyn W18, pid 21818 no longer needed, terminate now

Wed Sep 18 18:48:53:591 2019


DpHdlDeadWp: W18 (pid=21818) terminated automatically

Wed Sep 18 18:49:13:623 2019


DpWpDynCreate: created new work process W17-23331

Wed Sep 18 18:54:21:932 2019


DpHdlDeadWp: W17 (pid=23331) terminated automatically

Wed Sep 18 18:59:02:188 2019


DpWpDynCreate: created new work process W7-26812

Wed Sep 18 18:59:41:117 2019


DpWpDynCreate: created new work process W20-27063

Wed Sep 18 19:04:06:449 2019


DpWpDynCreate: created new work process W19-28742
DpHdlDeadWp: W7 (pid=26812) terminated automatically

Wed Sep 18 19:04:52:883 2019


DpWpCheck: dyn W20, pid 27063 no longer needed, terminate now

Wed Sep 18 19:04:53:312 2019


DpHdlDeadWp: W20 (pid=27063) terminated automatically

Wed Sep 18 19:09:12:889 2019


DpWpCheck: dyn W19, pid 28742 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=28742) terminated automatically

Wed Sep 18 19:18:04:645 2019


DpWpDynCreate: created new work process W18-28986

Wed Sep 18 19:20:02:003 2019


DpWpDynCreate: created new work process W17-29528

Wed Sep 18 19:23:08:293 2019


DpHdlDeadWp: W18 (pid=28986) terminated automatically

Wed Sep 18 19:25:04:110 2019


DpHdlDeadWp: W17 (pid=29528) terminated automatically

Wed Sep 18 19:26:08:568 2019


DpWpDynCreate: created new work process W7-31492

Wed Sep 18 19:29:02:308 2019


DpWpDynCreate: created new work process W20-32439

Wed Sep 18 19:31:10:562 2019


DpHdlDeadWp: W7 (pid=31492) terminated automatically

Wed Sep 18 19:34:03:755 2019


DpHdlDeadWp: W20 (pid=32439) terminated automatically

Wed Sep 18 19:34:15:075 2019


DpWpDynCreate: created new work process W19-1584

Wed Sep 18 19:35:02:459 2019


DpWpDynCreate: created new work process W18-2018
DpWpDynCreate: created new work process W17-2019

Wed Sep 18 19:39:02:231 2019


DpWpDynCreate: created new work process W7-3613

Wed Sep 18 19:39:21:020 2019


DpHdlDeadWp: W19 (pid=1584) terminated automatically

Wed Sep 18 19:40:05:161 2019


DpWpCheck: dyn W17, pid 2019 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=2018) terminated automatically

Wed Sep 18 19:40:05:495 2019


DpHdlDeadWp: W17 (pid=2019) terminated automatically

Wed Sep 18 19:41:02:985 2019


DpHdlSoftCancel: cancel request for T138_U9407_M0 received from DISP (reason=DP_SOFTCANCEL_AD_MSG)

Wed Sep 18 19:44:04:348 2019


DpHdlDeadWp: W7 (pid=3613) terminated automatically

Wed Sep 18 19:44:10:209 2019


DpWpDynCreate: created new work process W20-5116

Wed Sep 18 19:49:11:668 2019


DpHdlDeadWp: W20 (pid=5116) terminated automatically

Wed Sep 18 19:49:11:889 2019


DpWpDynCreate: created new work process W19-7000

Wed Sep 18 19:54:09:358 2019


DpWpDynCreate: created new work process W18-8522

Wed Sep 18 19:54:12:970 2019


DpWpCheck: dyn W19, pid 7000 no longer needed, terminate now

Wed Sep 18 19:54:13:040 2019


DpHdlDeadWp: W19 (pid=7000) terminated automatically

Wed Sep 18 19:58:04:775 2019


DpHdlDeadWp: W12 (pid=8254) terminated automatically
DpWpDynCreate: created new work process W12-9804

Wed Sep 18 19:59:14:074 2019


DpHdlDeadWp: W18 (pid=8522) terminated automatically

Wed Sep 18 20:00:57:531 2019


DpHdlDeadWp: W12 (pid=9804) terminated automatically
DpWpDynCreate: created new work process W12-10926

Wed Sep 18 20:02:23:133 2019


DpWpDynCreate: created new work process W17-13885

Wed Sep 18 20:03:28:482 2019


DpWpDynCreate: created new work process W7-16487

Wed Sep 18 20:06:12:996 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T11_U20526 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T11_U20526_M0   |001|EXT_MKARIM  |SST-LAP-HP0002      |18:03:19|1  |SAPLSDBACCMS                            |high|     |                                                  |DBACOCKPIT|
DpHdlSoftCancel: cancel request for T11_U20526_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Wed Sep 18 20:07:25:410 2019


DpHdlDeadWp: W17 (pid=13885) terminated automatically

Wed Sep 18 20:07:52:427 2019


DpWpDynCreate: created new work process W20-27230

Wed Sep 18 20:07:53:339 2019


DpWpDynCreate: created new work process W19-27233

Wed Sep 18 20:08:33:000 2019


DpWpCheck: dyn W7, pid 16487 no longer needed, terminate now

Wed Sep 18 20:08:33:461 2019


DpHdlDeadWp: W7 (pid=16487) terminated automatically

Wed Sep 18 20:12:53:008 2019


DpWpCheck: dyn W20, pid 27230 no longer needed, terminate now

Wed Sep 18 20:12:53:787 2019


DpHdlDeadWp: W20 (pid=27230) terminated automatically

Wed Sep 18 20:13:00:849 2019


DpHdlDeadWp: W19 (pid=27233) terminated automatically

Wed Sep 18 20:24:09:803 2019


DpWpDynCreate: created new work process W18-13925

Wed Sep 18 20:29:13:039 2019


DpWpCheck: dyn W18, pid 13925 no longer needed, terminate now

Wed Sep 18 20:29:13:202 2019


DpHdlDeadWp: W18 (pid=13925) terminated automatically

Wed Sep 18 20:29:13:516 2019


DpWpDynCreate: created new work process W17-15470

Wed Sep 18 20:34:08:852 2019


DpWpDynCreate: created new work process W7-17117

Wed Sep 18 20:34:14:396 2019


DpHdlDeadWp: W17 (pid=15470) terminated automatically

Wed Sep 18 20:39:03:964 2019


DpWpDynCreate: created new work process W20-18649
Wed Sep 18 20:39:13:056 2019
DpWpCheck: dyn W7, pid 17117 no longer needed, terminate now

Wed Sep 18 20:39:13:197 2019


DpHdlDeadWp: W7 (pid=17117) terminated automatically

Wed Sep 18 20:44:04:981 2019


DpHdlDeadWp: W20 (pid=18649) terminated automatically

Wed Sep 18 20:44:25:095 2019


DpWpDynCreate: created new work process W19-20459

Wed Sep 18 20:49:33:073 2019


DpWpCheck: dyn W19, pid 20459 no longer needed, terminate now

Wed Sep 18 20:49:33:579 2019


DpHdlDeadWp: W19 (pid=20459) terminated automatically

Wed Sep 18 20:51:02:256 2019


DpWpDynCreate: created new work process W18-22765

Wed Sep 18 20:56:04:120 2019


DpHdlDeadWp: W18 (pid=22765) terminated automatically

Wed Sep 18 20:56:08:454 2019


DpHdlDeadWp: W12 (pid=10926) terminated automatically
DpWpDynCreate: created new work process W12-24776

Wed Sep 18 21:03:26:856 2019


DpWpDynCreate: created new work process W17-3106

Wed Sep 18 21:04:05:597 2019


DpWpDynCreate: created new work process W7-6368

Wed Sep 18 21:07:25:751 2019


DpWpDynCreate: created new work process W20-16342

Wed Sep 18 21:08:33:108 2019


DpWpCheck: dyn W17, pid 3106 no longer needed, terminate now

Wed Sep 18 21:08:34:037 2019


DpHdlDeadWp: W17 (pid=3106) terminated automatically

Wed Sep 18 21:09:06:776 2019


DpHdlDeadWp: W7 (pid=6368) terminated automatically

Wed Sep 18 21:12:33:115 2019


DpWpCheck: dyn W20, pid 16342 no longer needed, terminate now

Wed Sep 18 21:12:33:315 2019


DpHdlDeadWp: W20 (pid=16342) terminated automatically

Wed Sep 18 21:14:22:951 2019


DpWpDynCreate: created new work process W19-26504

Wed Sep 18 21:16:08:561 2019


DpWpDynCreate: created new work process W18-27106

Wed Sep 18 21:18:43:225 2019


DpWpDynCreate: created new work process W17-27988

Wed Sep 18 21:19:26:076 2019


DpHdlDeadWp: W19 (pid=26504) terminated automatically

Wed Sep 18 21:20:08:717 2019


DpWpDynCreate: created new work process W7-28294

Wed Sep 18 21:21:13:129 2019


DpWpCheck: dyn W18, pid 27106 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=27106) terminated automatically

Wed Sep 18 21:23:53:133 2019


DpWpCheck: dyn W17, pid 27988 no longer needed, terminate now

Wed Sep 18 21:23:53:386 2019


DpHdlDeadWp: W17 (pid=27988) terminated automatically

Wed Sep 18 21:25:18:150 2019


DpHdlDeadWp: W7 (pid=28294) terminated automatically

Wed Sep 18 21:29:21:525 2019


DpWpDynCreate: created new work process W20-31493
DpWpDynCreate: created new work process W19-31494

Wed Sep 18 21:34:07:706 2019


DpWpDynCreate: created new work process W18-706

Wed Sep 18 21:34:25:605 2019


DpHdlDeadWp: W19 (pid=31494) terminated automatically
DpWpCheck: dyn W20, pid 31493 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=31493) terminated automatically

Wed Sep 18 21:37:02:650 2019


DpWpDynCreate: created new work process W17-1626

Wed Sep 18 21:37:03:935 2019


DpWpDynCreate: created new work process W7-1645

Wed Sep 18 21:39:13:161 2019


DpWpCheck: dyn W18, pid 706 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=706) terminated automatically

Wed Sep 18 21:42:03:529 2019


DpHdlDeadWp: W17 (pid=1626) terminated automatically

Wed Sep 18 21:42:04:454 2019


DpHdlDeadWp: W7 (pid=1645) terminated automatically
DpWpDynCreate: created new work process W19-3577

Wed Sep 18 21:47:06:040 2019


DpHdlDeadWp: W19 (pid=3577) terminated automatically

Wed Sep 18 21:49:02:419 2019


DpWpDynCreate: created new work process W20-5798

Wed Sep 18 21:54:04:141 2019


DpHdlDeadWp: W20 (pid=5798) terminated automatically
Wed Sep 18 21:54:19:343 2019
DpWpDynCreate: created new work process W18-7787

Wed Sep 18 21:59:29:900 2019


DpHdlDeadWp: W18 (pid=7787) terminated automatically

Wed Sep 18 22:03:29:681 2019


DpWpDynCreate: created new work process W17-18620

Wed Sep 18 22:07:37:671 2019


DpWpDynCreate: created new work process W7-616

Wed Sep 18 22:08:33:222 2019


DpWpCheck: dyn W17, pid 18620 no longer needed, terminate now

Wed Sep 18 22:08:33:326 2019


DpHdlDeadWp: W17 (pid=18620) terminated automatically

Wed Sep 18 22:12:53:230 2019


DpWpCheck: dyn W7, pid 616 no longer needed, terminate now

Wed Sep 18 22:12:53:622 2019


DpHdlDeadWp: W7 (pid=616) terminated automatically

Wed Sep 18 22:20:12:871 2019


DpWpDynCreate: created new work process W19-11768

Wed Sep 18 22:23:01:578 2019


DpWpDynCreate: created new work process W20-12518

Wed Sep 18 22:24:02:137 2019


DpWpDynCreate: created new work process W18-12795

Wed Sep 18 22:25:13:254 2019


DpWpCheck: dyn W19, pid 11768 no longer needed, terminate now

Wed Sep 18 22:25:13:380 2019


DpHdlDeadWp: W19 (pid=11768) terminated automatically

Wed Sep 18 22:28:08:386 2019


DpHdlDeadWp: W20 (pid=12518) terminated automatically

Wed Sep 18 22:29:03:463 2019


DpHdlDeadWp: W18 (pid=12795) terminated automatically

Wed Sep 18 22:30:02:835 2019


DpWpDynCreate: created new work process W17-14843

Wed Sep 18 22:34:07:690 2019


DpWpDynCreate: created new work process W7-15993

Wed Sep 18 22:35:03:825 2019


DpHdlDeadWp: W17 (pid=14843) terminated automatically

Wed Sep 18 22:38:15:432 2019


DpWpDynCreate: created new work process W19-17531

Wed Sep 18 22:39:13:279 2019


DpWpCheck: dyn W7, pid 15993 no longer needed, terminate now
Wed Sep 18 22:39:14:016 2019
DpHdlDeadWp: W7 (pid=15993) terminated automatically

Wed Sep 18 22:42:09:548 2019


DpWpDynCreate: created new work process W20-18782

Wed Sep 18 22:42:10:365 2019


DpWpDynCreate: created new work process W18-18788

Wed Sep 18 22:43:33:289 2019


DpWpCheck: dyn W19, pid 17531 no longer needed, terminate now

Wed Sep 18 22:43:34:335 2019


DpHdlDeadWp: W19 (pid=17531) terminated automatically

Wed Sep 18 22:47:12:306 2019


DpHdlDeadWp: W18 (pid=18788) terminated automatically
DpHdlDeadWp: W20 (pid=18782) terminated automatically

Wed Sep 18 22:59:05:609 2019


DpWpDynCreate: created new work process W17-24699

Wed Sep 18 23:04:08:248 2019


DpHdlDeadWp: W17 (pid=24699) terminated automatically

Wed Sep 18 23:04:09:849 2019


DpWpDynCreate: created new work process W7-4749

Wed Sep 18 23:09:13:333 2019


DpWpCheck: dyn W7, pid 4749 no longer needed, terminate now
DpHdlDeadWp: W7 (pid=4749) terminated automatically

Wed Sep 18 23:09:45:356 2019


DpWpDynCreate: created new work process W19-23697

Wed Sep 18 23:14:03:045 2019


DpWpDynCreate: created new work process W18-25313

Wed Sep 18 23:14:53:343 2019


DpWpCheck: dyn W19, pid 23697 no longer needed, terminate now

Wed Sep 18 23:14:53:487 2019


DpHdlDeadWp: W19 (pid=23697) terminated automatically

Wed Sep 18 23:19:13:350 2019


DpWpCheck: dyn W18, pid 25313 no longer needed, terminate now

Wed Sep 18 23:19:13:528 2019


DpHdlDeadWp: W18 (pid=25313) terminated automatically

Wed Sep 18 23:20:01:849 2019


DpWpDynCreate: created new work process W20-27316

Wed Sep 18 23:20:02:004 2019


DpWpDynCreate: created new work process W17-27319

Wed Sep 18 23:25:02:857 2019


DpHdlDeadWp: W20 (pid=27316) terminated automatically
Wed Sep 18 23:25:03:599 2019
DpHdlDeadWp: W17 (pid=27319) terminated automatically

Wed Sep 18 23:29:09:176 2019


DpWpDynCreate: created new work process W7-30341

Wed Sep 18 23:29:13:264 2019


DpWpDynCreate: created new work process W19-30358

Wed Sep 18 23:31:27:484 2019


DpWpDynCreate: created new work process W18-31128

Wed Sep 18 23:34:10:558 2019


DpHdlDeadWp: W7 (pid=30341) terminated automatically

Wed Sep 18 23:34:18:606 2019


DpHdlDeadWp: W19 (pid=30358) terminated automatically

Wed Sep 18 23:34:18:957 2019


DpWpDynCreate: created new work process W20-32209

Wed Sep 18 23:34:40:561 2019


DpWpDynCreate: created new work process W17-32504

Wed Sep 18 23:36:33:378 2019


DpWpCheck: dyn W18, pid 31128 no longer needed, terminate now

Wed Sep 18 23:36:33:723 2019


DpHdlDeadWp: W18 (pid=31128) terminated automatically

Wed Sep 18 23:39:19:678 2019


DpHdlDeadWp: W20 (pid=32209) terminated automatically

Wed Sep 18 23:39:53:384 2019


DpWpCheck: dyn W17, pid 32504 no longer needed, terminate now

Wed Sep 18 23:39:53:756 2019


DpHdlDeadWp: W17 (pid=32504) terminated automatically

Wed Sep 18 23:43:15:949 2019


DpWpDynCreate: created new work process W7-3014

Wed Sep 18 23:44:04:315 2019


DpWpDynCreate: created new work process W19-3256

Wed Sep 18 23:44:12:684 2019


DpWpDynCreate: created new work process W18-3280

Wed Sep 18 23:48:33:399 2019


DpWpCheck: dyn W7, pid 3014 no longer needed, terminate now

Wed Sep 18 23:48:33:849 2019


DpHdlDeadWp: W7 (pid=3014) terminated automatically

Wed Sep 18 23:49:02:535 2019


DpWpDynCreate: created new work process W20-4783

Wed Sep 18 23:49:06:620 2019


DpHdlDeadWp: W19 (pid=3256) terminated automatically

Wed Sep 18 23:49:14:292 2019


DpHdlDeadWp: W18 (pid=3280) terminated automatically

Wed Sep 18 23:54:03:604 2019


DpHdlDeadWp: W20 (pid=4783) terminated automatically

Wed Sep 18 23:54:08:199 2019


DpWpDynCreate: created new work process W17-6749

Wed Sep 18 23:59:07:728 2019


DpWpDynCreate: created new work process W7-8188

Wed Sep 18 23:59:10:060 2019


DpHdlDeadWp: W17 (pid=6749) terminated automatically

Thu Sep 19 00:00:00:862 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 9->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5
DpWpConf: wp reconfiguration, stop W16, pid 404
DpAdaptWppriv_max_no : 4 -> 4

Thu Sep 19 00:00:01:390 2019


DpHdlDeadWp: W16 (pid=404) terminated automatically

Thu Sep 19 00:03:42:181 2019


DpWpDynCreate: created new work process W19-14116

Thu Sep 19 00:03:46:634 2019


DpWpDynCreate: created new work process W18-14338

Thu Sep 19 00:08:48:653 2019


DpHdlDeadWp: W18 (pid=14338) terminated automatically
DpWpCheck: dyn W19, pid 14116 no longer needed, terminate now

Thu Sep 19 00:08:48:928 2019


DpHdlDeadWp: W19 (pid=14116) terminated automatically

Thu Sep 19 00:13:04:486 2019


DpWpDynCreate: created new work process W20-9159

Thu Sep 19 00:13:04:900 2019


DpWpDynCreate: created new work process W17-9160

Thu Sep 19 00:18:06:400 2019


DpHdlDeadWp: W17 (pid=9160) terminated automatically
DpWpCheck: dyn W20, pid 9159 no longer needed, terminate now

Thu Sep 19 00:18:07:502 2019


DpHdlDeadWp: W20 (pid=9159) terminated automatically

Thu Sep 19 00:19:02:371 2019


DpWpDynCreate: created new work process W16-17948

Thu Sep 19 00:19:14:346 2019


DpWpDynCreate: created new work process W18-18270
Thu Sep 19 00:24:03:906 2019
DpHdlDeadWp: W16 (pid=17948) terminated automatically

Thu Sep 19 00:24:15:671 2019


DpHdlDeadWp: W18 (pid=18270) terminated automatically

Thu Sep 19 00:28:02:614 2019


DpWpDynCreate: created new work process W19-27825

Thu Sep 19 00:29:09:145 2019


DpWpDynCreate: created new work process W17-29287

Thu Sep 19 00:33:13:478 2019


DpWpCheck: dyn W19, pid 27825 no longer needed, terminate now

Thu Sep 19 00:33:14:473 2019


DpHdlDeadWp: W19 (pid=27825) terminated automatically

Thu Sep 19 00:34:13:481 2019


DpWpCheck: dyn W17, pid 29287 no longer needed, terminate now

Thu Sep 19 00:34:14:502 2019


DpHdlDeadWp: W17 (pid=29287) terminated automatically

Thu Sep 19 00:50:08:321 2019


DpWpDynCreate: created new work process W20-11469

Thu Sep 19 00:54:43:939 2019


DpWpDynCreate: created new work process W16-13066

Thu Sep 19 00:55:13:524 2019


DpWpCheck: dyn W20, pid 11469 no longer needed, terminate now

Thu Sep 19 00:55:14:686 2019


DpHdlDeadWp: W20 (pid=11469) terminated automatically

Thu Sep 19 00:59:47:943 2019


DpHdlDeadWp: W16 (pid=13066) terminated automatically

Thu Sep 19 01:00:19:596 2019


DpWpDynCreate: created new work process W18-25777

Thu Sep 19 01:05:33:539 2019


DpWpCheck: dyn W18, pid 25777 no longer needed, terminate now

Thu Sep 19 01:05:33:937 2019


DpHdlDeadWp: W18 (pid=25777) terminated automatically

Thu Sep 19 01:05:38:494 2019


DpWpDynCreate: created new work process W19-6112

Thu Sep 19 01:09:26:602 2019


DpHdlDeadWp: W13 (pid=12411) terminated automatically
DpWpDynCreate: created new work process W13-15004

Thu Sep 19 01:10:02:301 2019


DpWpDynCreate: created new work process W17-15379

Thu Sep 19 01:10:53:547 2019


DpWpCheck: dyn W19, pid 6112 no longer needed, terminate now

Thu Sep 19 01:10:53:696 2019


DpHdlDeadWp: W19 (pid=6112) terminated automatically

Thu Sep 19 01:12:36:469 2019


DpWpDynCreate: created new work process W20-27609

Thu Sep 19 01:15:04:270 2019


DpHdlDeadWp: W17 (pid=15379) terminated automatically

Thu Sep 19 01:16:03:487 2019


DpWpDynCreate: created new work process W16-28465

Thu Sep 19 01:17:53:555 2019


DpWpCheck: dyn W20, pid 27609 no longer needed, terminate now

Thu Sep 19 01:17:54:485 2019


DpHdlDeadWp: W20 (pid=27609) terminated automatically

Thu Sep 19 01:18:42:937 2019


DpWpDynCreate: created new work process W18-29557

Thu Sep 19 01:21:13:562 2019


DpWpCheck: dyn W16, pid 28465 no longer needed, terminate now

Thu Sep 19 01:21:14:646 2019


DpHdlDeadWp: W16 (pid=28465) terminated automatically

Thu Sep 19 01:23:53:566 2019


DpWpCheck: dyn W18, pid 29557 no longer needed, terminate now

Thu Sep 19 01:23:53:850 2019


DpHdlDeadWp: W18 (pid=29557) terminated automatically

Thu Sep 19 01:26:01:803 2019


DpWpDynCreate: created new work process W19-31752

Thu Sep 19 01:30:17:279 2019


DpWpDynCreate: created new work process W17-763

Thu Sep 19 01:31:02:640 2019


DpHdlDeadWp: W19 (pid=31752) terminated automatically

Thu Sep 19 01:35:33:585 2019


DpWpCheck: dyn W17, pid 763 no longer needed, terminate now

Thu Sep 19 01:35:34:471 2019


DpHdlDeadWp: W17 (pid=763) terminated automatically

Thu Sep 19 01:36:19:097 2019


DpWpDynCreate: created new work process W20-3042

Thu Sep 19 01:40:41:705 2019


DpHdlDeadWp: W13 (pid=15004) terminated automatically
DpWpDynCreate: created new work process W13-4448

Thu Sep 19 01:41:33:594 2019


DpWpCheck: dyn W20, pid 3042 no longer needed, terminate now
Thu Sep 19 01:41:33:828 2019
DpHdlDeadWp: W20 (pid=3042) terminated automatically

Thu Sep 19 01:44:05:825 2019


DpWpDynCreate: created new work process W16-5536

Thu Sep 19 01:49:13:606 2019


DpWpCheck: dyn W16, pid 5536 no longer needed, terminate now

Thu Sep 19 01:49:14:251 2019


DpHdlDeadWp: W16 (pid=5536) terminated automatically

Thu Sep 19 01:50:04:961 2019


DpWpDynCreate: created new work process W18-7504

Thu Sep 19 01:55:13:616 2019


DpWpCheck: dyn W18, pid 7504 no longer needed, terminate now

Thu Sep 19 01:55:14:616 2019


DpHdlDeadWp: W18 (pid=7504) terminated automatically

Thu Sep 19 01:56:02:358 2019


DpWpDynCreate: created new work process W19-9557

Thu Sep 19 02:01:13:629 2019


DpWpCheck: dyn W19, pid 9557 no longer needed, terminate now

Thu Sep 19 02:01:14:001 2019


DpHdlDeadWp: W19 (pid=9557) terminated automatically

Thu Sep 19 02:18:10:497 2019


DpWpDynCreate: created new work process W17-12131

Thu Sep 19 02:23:11:508 2019


DpHdlDeadWp: W17 (pid=12131) terminated automatically

Thu Sep 19 02:26:02:896 2019


DpWpDynCreate: created new work process W20-14807

Thu Sep 19 02:31:05:090 2019


DpHdlDeadWp: W20 (pid=14807) terminated automatically

Thu Sep 19 02:37:34:776 2019


DpWpDynCreate: created new work process W16-18375

Thu Sep 19 02:37:41:769 2019


DpWpDynCreate: created new work process W18-18426

Thu Sep 19 02:42:36:651 2019


DpHdlDeadWp: W16 (pid=18375) terminated automatically

Thu Sep 19 02:42:48:141 2019


DpHdlDeadWp: W18 (pid=18426) terminated automatically

Thu Sep 19 02:43:36:456 2019


DpWpDynCreate: created new work process W19-20278

Thu Sep 19 02:48:38:077 2019


DpHdlDeadWp: W19 (pid=20278) terminated automatically

Thu Sep 19 02:53:19:080 2019


DpWpDynCreate: created new work process W17-23570

Thu Sep 19 02:58:33:760 2019


DpWpCheck: dyn W17, pid 23570 no longer needed, terminate now

Thu Sep 19 02:58:34:553 2019


DpHdlDeadWp: W17 (pid=23570) terminated automatically

Thu Sep 19 03:12:01:936 2019


DpWpDynCreate: created new work process W20-25646

Thu Sep 19 03:17:03:501 2019


DpHdlDeadWp: W20 (pid=25646) terminated automatically

Thu Sep 19 03:20:04:424 2019


DpWpDynCreate: created new work process W16-28404

Thu Sep 19 03:25:05:527 2019


DpHdlDeadWp: W16 (pid=28404) terminated automatically

Thu Sep 19 03:35:22:455 2019


DpWpDynCreate: created new work process W18-674

Thu Sep 19 03:40:25:770 2019


DpHdlDeadWp: W18 (pid=674) terminated automatically

Thu Sep 19 03:43:03:553 2019


DpWpDynCreate: created new work process W19-3260

Thu Sep 19 03:45:21:346 2019


DpWpDynCreate: created new work process W17-4049

Thu Sep 19 03:48:14:040 2019


DpHdlDeadWp: W19 (pid=3260) terminated automatically

Thu Sep 19 03:50:23:090 2019


DpHdlDeadWp: W17 (pid=4049) terminated automatically

Thu Sep 19 03:55:22:810 2019


DpWpDynCreate: created new work process W20-7630

Thu Sep 19 04:00:22:648 2019


DpWpDynCreate: created new work process W16-9445

Thu Sep 19 04:00:23:545 2019


DpHdlDeadWp: W20 (pid=7630) terminated automatically

Thu Sep 19 04:05:23:793 2019


DpHdlDeadWp: W16 (pid=9445) terminated automatically

Thu Sep 19 04:13:06:784 2019


DpWpDynCreate: created new work process W18-9375

Thu Sep 19 04:15:04:252 2019


DpWpDynCreate: created new work process W19-9979
Thu Sep 19 04:18:13:903 2019
DpWpCheck: dyn W18, pid 9375 no longer needed, terminate now

Thu Sep 19 04:18:14:437 2019


DpHdlDeadWp: W18 (pid=9375) terminated automatically

Thu Sep 19 04:20:10:533 2019


DpHdlDeadWp: W19 (pid=9979) terminated automatically

Thu Sep 19 04:20:19:648 2019


DpWpDynCreate: created new work process W17-11692

Thu Sep 19 04:25:20:533 2019


DpHdlDeadWp: W17 (pid=11692) terminated automatically

Thu Sep 19 04:30:03:828 2019


DpWpDynCreate: created new work process W20-15541

Thu Sep 19 04:30:07:871 2019


DpWpDynCreate: created new work process W16-15548

Thu Sep 19 04:35:04:673 2019


DpHdlDeadWp: W20 (pid=15541) terminated automatically

Thu Sep 19 04:35:13:928 2019


DpWpCheck: dyn W16, pid 15548 no longer needed, terminate now

Thu Sep 19 04:35:14:080 2019


DpHdlDeadWp: W16 (pid=15548) terminated automatically

Thu Sep 19 04:35:26:558 2019


DpWpDynCreate: created new work process W18-17670

Thu Sep 19 04:40:33:937 2019


DpWpCheck: dyn W18, pid 17670 no longer needed, terminate now

Thu Sep 19 04:40:34:438 2019


DpHdlDeadWp: W18 (pid=17670) terminated automatically

Thu Sep 19 04:41:03:910 2019


DpWpDynCreate: created new work process W19-19298

Thu Sep 19 04:46:05:180 2019


DpHdlDeadWp: W19 (pid=19298) terminated automatically

Thu Sep 19 04:49:03:632 2019


DpWpDynCreate: created new work process W17-22098

Thu Sep 19 04:54:05:078 2019


DpHdlDeadWp: W17 (pid=22098) terminated automatically

Thu Sep 19 05:02:07:412 2019


DpWpDynCreate: created new work process W20-29822

Thu Sep 19 05:02:14:597 2019


DpWpDynCreate: created new work process W16-30216

Thu Sep 19 05:07:09:538 2019


DpHdlDeadWp: W20 (pid=29822) terminated automatically
Thu Sep 19 05:07:33:980 2019
DpWpCheck: dyn W16, pid 30216 no longer needed, terminate now

Thu Sep 19 05:07:34:573 2019


DpHdlDeadWp: W16 (pid=30216) terminated automatically

Thu Sep 19 05:10:15:998 2019


DpWpDynCreate: created new work process W18-26614

Thu Sep 19 05:11:02:795 2019


DpWpDynCreate: created new work process W19-26897

Thu Sep 19 05:15:33:994 2019


DpWpCheck: dyn W18, pid 26614 no longer needed, terminate now

Thu Sep 19 05:15:34:129 2019


DpHdlDeadWp: W18 (pid=26614) terminated automatically

Thu Sep 19 05:16:03:122 2019


DpHdlDeadWp: W19 (pid=26897) terminated automatically

Thu Sep 19 05:23:09:739 2019


DpWpDynCreate: created new work process W17-30890

Thu Sep 19 05:23:11:398 2019


DpWpDynCreate: created new work process W20-30893

Thu Sep 19 05:23:11:517 2019


DpWpDynCreate: created new work process W16-30894

Thu Sep 19 05:28:14:014 2019


DpWpCheck: dyn W16, pid 30894 no longer needed, terminate now
DpWpCheck: dyn W17, pid 30890 no longer needed, terminate now
DpWpCheck: dyn W20, pid 30893 no longer needed, terminate now

Thu Sep 19 05:28:14:809 2019


DpHdlDeadWp: W16 (pid=30894) terminated automatically
DpHdlDeadWp: W17 (pid=30890) terminated automatically
DpHdlDeadWp: W20 (pid=30893) terminated automatically

Thu Sep 19 05:41:25:314 2019


DpWpDynCreate: created new work process W18-4699

Thu Sep 19 05:46:34:043 2019


DpWpCheck: dyn W18, pid 4699 no longer needed, terminate now

Thu Sep 19 05:46:34:744 2019


DpHdlDeadWp: W18 (pid=4699) terminated automatically

Thu Sep 19 05:47:23:786 2019


DpWpDynCreate: created new work process W19-6799

Thu Sep 19 05:52:34:054 2019


DpWpCheck: dyn W19, pid 6799 no longer needed, terminate now

Thu Sep 19 05:52:34:215 2019


DpHdlDeadWp: W19 (pid=6799) terminated automatically
Thu Sep 19 05:56:10:814 2019
DpWpDynCreate: created new work process W16-10080

Thu Sep 19 06:01:14:070 2019


DpWpCheck: dyn W16, pid 10080 no longer needed, terminate now

Thu Sep 19 06:01:14:318 2019


DpHdlDeadWp: W16 (pid=10080) terminated automatically

Thu Sep 19 06:12:54:092 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T87_U22729 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T87_U22729_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |05:31:31|2 |SAPLSDBACCMS |high| | |DBACOCKPIT|
DpHdlSoftCancel: cancel request for T87_U22729_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Thu Sep 19 06:21:19:352 2019


DpWpDynCreate: created new work process W17-14145

Thu Sep 19 06:26:20:683 2019


DpHdlDeadWp: W17 (pid=14145) terminated automatically

Thu Sep 19 06:28:03:743 2019


DpWpDynCreate: created new work process W20-16396

Thu Sep 19 06:31:02:975 2019


DpWpDynCreate: created new work process W18-17433

Thu Sep 19 06:33:04:891 2019


DpHdlDeadWp: W20 (pid=16396) terminated automatically

Thu Sep 19 06:36:03:202 2019


DpHdlDeadWp: W18 (pid=17433) terminated automatically

Thu Sep 19 06:41:33:757 2019


DpWpDynCreate: created new work process W19-20868

Thu Sep 19 06:46:34:152 2019


DpWpCheck: dyn W19, pid 20868 no longer needed, terminate now

Thu Sep 19 06:46:34:887 2019


DpHdlDeadWp: W19 (pid=20868) terminated automatically

Thu Sep 19 06:48:02:816 2019


DpWpDynCreate: created new work process W16-23010

Thu Sep 19 06:48:03:035 2019


DpWpDynCreate: created new work process W17-23011

Thu Sep 19 06:53:03:450 2019


DpHdlDeadWp: W16 (pid=23010) terminated automatically

Thu Sep 19 06:53:08:773 2019


DpHdlDeadWp: W17 (pid=23011) terminated automatically

Thu Sep 19 06:56:02:742 2019


DpWpDynCreate: created new work process W20-25855
Thu Sep 19 07:01:04:644 2019
DpHdlDeadWp: W20 (pid=25855) terminated automatically

Thu Sep 19 07:01:28:091 2019


DpWpDynCreate: created new work process W18-27626

Thu Sep 19 07:01:33:308 2019


DpWpDynCreate: created new work process W19-27849

Thu Sep 19 07:06:31:230 2019


DpHdlDeadWp: W18 (pid=27626) terminated automatically

Thu Sep 19 07:06:34:184 2019


DpWpCheck: dyn W19, pid 27849 no longer needed, terminate now

Thu Sep 19 07:06:35:288 2019


DpHdlDeadWp: W19 (pid=27849) terminated automatically

Thu Sep 19 07:26:12:468 2019


DpWpDynCreate: created new work process W16-31378

Thu Sep 19 07:31:14:119 2019


DpHdlDeadWp: W16 (pid=31378) terminated automatically

Thu Sep 19 07:31:18:991 2019


DpWpDynCreate: created new work process W17-514

Thu Sep 19 07:36:08:826 2019


DpWpDynCreate: created new work process W20-2159

Thu Sep 19 07:36:21:240 2019


DpHdlDeadWp: W17 (pid=514) terminated automatically

Thu Sep 19 07:41:02:795 2019


DpWpDynCreate: created new work process W18-3855

Thu Sep 19 07:41:04:067 2019


DpWpDynCreate: created new work process W19-3860

Thu Sep 19 07:41:04:467 2019


DpWpDynCreate: created new work process W16-3861

Thu Sep 19 07:41:09:807 2019


DpHdlDeadWp: W20 (pid=2159) terminated automatically

Thu Sep 19 07:46:03:232 2019


DpHdlDeadWp: W18 (pid=3855) terminated automatically

Thu Sep 19 07:46:07:742 2019


DpWpCheck: dyn W16, pid 3861 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=3860) terminated automatically
DpHdlDeadWp: W16 (pid=3861) terminated automatically

Thu Sep 19 07:46:12:073 2019


DpWpDynCreate: created new work process W17-5609

Thu Sep 19 07:51:14:255 2019


DpWpCheck: dyn W17, pid 5609 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=5609) terminated automatically

Thu Sep 19 07:55:06:837 2019


DpHdlDeadWp: W12 (pid=24776) terminated automatically
DpWpDynCreate: created new work process W12-8833

Thu Sep 19 07:56:14:832 2019


DpWpDynCreate: created new work process W20-9148

Thu Sep 19 08:01:15:672 2019


DpHdlDeadWp: W20 (pid=9148) terminated automatically

Thu Sep 19 08:01:19:094 2019


DpWpDynCreate: created new work process W18-10846

Thu Sep 19 08:06:08:453 2019


DpWpDynCreate: created new work process W19-28165

Thu Sep 19 08:06:11:829 2019


DpWpDynCreate: created new work process W16-28471

Thu Sep 19 08:06:34:282 2019


DpWpCheck: dyn W18, pid 10846 no longer needed, terminate now

Thu Sep 19 08:06:34:589 2019


DpHdlDeadWp: W18 (pid=10846) terminated automatically

Thu Sep 19 08:11:09:843 2019


DpHdlDeadWp: W19 (pid=28165) terminated automatically

Thu Sep 19 08:11:14:291 2019


DpWpCheck: dyn W16, pid 28471 no longer needed, terminate now

Thu Sep 19 08:11:14:420 2019


DpHdlDeadWp: W16 (pid=28471) terminated automatically

Thu Sep 19 08:15:02:946 2019


DpWpDynCreate: created new work process W17-11055

Thu Sep 19 08:15:49:663 2019


DpWpDynCreate: created new work process W20-11331

Thu Sep 19 08:18:40:730 2019


DpWpDynCreate: created new work process W18-12285

Thu Sep 19 08:20:03:575 2019


DpHdlDeadWp: W17 (pid=11055) terminated automatically

Thu Sep 19 08:20:54:307 2019


DpWpCheck: dyn W20, pid 11331 no longer needed, terminate now

Thu Sep 19 08:20:54:703 2019


DpHdlDeadWp: W20 (pid=11331) terminated automatically

Thu Sep 19 08:21:15:336 2019


DpWpDynCreate: created new work process W19-12986

Thu Sep 19 08:23:54:311 2019


DpWpCheck: dyn W18, pid 12285 no longer needed, terminate now
Thu Sep 19 08:23:54:842 2019
DpHdlDeadWp: W18 (pid=12285) terminated automatically

Thu Sep 19 08:26:16:488 2019


DpHdlDeadWp: W19 (pid=12986) terminated automatically

Thu Sep 19 08:28:12:273 2019


DpWpDynCreate: created new work process W16-16024

Thu Sep 19 08:31:19:662 2019


DpWpDynCreate: created new work process W17-16969

Thu Sep 19 08:33:14:326 2019


DpWpCheck: dyn W16, pid 16024 no longer needed, terminate now

Thu Sep 19 08:33:14:738 2019


DpHdlDeadWp: W16 (pid=16024) terminated automatically

Thu Sep 19 08:36:13:660 2019


DpWpDynCreate: created new work process W20-18487

Thu Sep 19 08:36:20:141 2019


DpHdlDeadWp: W17 (pid=16969) terminated automatically

Thu Sep 19 08:41:02:806 2019


DpWpDynCreate: created new work process W18-20007

Thu Sep 19 08:41:14:171 2019


DpHdlDeadWp: W20 (pid=18487) terminated automatically

Thu Sep 19 08:46:04:050 2019


DpHdlDeadWp: W18 (pid=20007) terminated automatically

Thu Sep 19 08:46:04:707 2019


DpWpDynCreate: created new work process W19-21795

Thu Sep 19 08:51:06:011 2019


DpHdlDeadWp: W19 (pid=21795) terminated automatically

Thu Sep 19 08:51:11:448 2019


DpWpDynCreate: created new work process W16-23336

Thu Sep 19 08:56:14:032 2019


DpHdlDeadWp: W16 (pid=23336) terminated automatically

Thu Sep 19 09:00:42:775 2019


DpWpDynCreate: created new work process W17-26787

Thu Sep 19 09:02:05:558 2019


DpWpDynCreate: created new work process W20-29960

Thu Sep 19 09:05:45:464 2019


DpHdlDeadWp: W17 (pid=26787) terminated automatically

Thu Sep 19 09:06:53:934 2019


DpWpDynCreate: created new work process W18-14628

Thu Sep 19 09:07:03:990 2019


DpWpDynCreate: created new work process W19-15402

Thu Sep 19 09:07:14:382 2019


DpWpCheck: dyn W20, pid 29960 no longer needed, terminate now

Thu Sep 19 09:07:14:563 2019


DpHdlDeadWp: W20 (pid=29960) terminated automatically

Thu Sep 19 09:11:54:753 2019


DpHdlDeadWp: W18 (pid=14628) terminated automatically

Thu Sep 19 09:12:04:647 2019


DpHdlDeadWp: W19 (pid=15402) terminated automatically

Thu Sep 19 09:18:04:569 2019


DpWpDynCreate: created new work process W16-28073

Thu Sep 19 09:21:02:811 2019


DpWpDynCreate: created new work process W17-29153

Thu Sep 19 09:23:14:404 2019


DpWpCheck: dyn W16, pid 28073 no longer needed, terminate now

Thu Sep 19 09:23:15:016 2019


DpHdlDeadWp: W16 (pid=28073) terminated automatically

Thu Sep 19 09:26:03:138 2019


DpWpDynCreate: created new work process W20-30700

Thu Sep 19 09:26:04:536 2019


DpHdlDeadWp: W17 (pid=29153) terminated automatically

Thu Sep 19 09:31:14:415 2019


DpWpCheck: dyn W20, pid 30700 no longer needed, terminate now

Thu Sep 19 09:31:14:825 2019


DpHdlDeadWp: W20 (pid=30700) terminated automatically

Thu Sep 19 09:33:44:150 2019


DpWpDynCreate: created new work process W18-983

Thu Sep 19 09:38:54:428 2019


DpWpCheck: dyn W18, pid 983 no longer needed, terminate now

Thu Sep 19 09:38:55:254 2019


DpHdlDeadWp: W18 (pid=983) terminated automatically

Thu Sep 19 09:39:55:541 2019


DpWpDynCreate: created new work process W19-3138

Thu Sep 19 09:45:03:464 2019


DpHdlDeadWp: W19 (pid=3138) terminated automatically

Thu Sep 19 09:55:02:523 2019


DpWpDynCreate: created new work process W16-8270

Thu Sep 19 09:55:34:458 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T104_U15258 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T104_U15258_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |09:14:01|2 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T104_U15258_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Thu Sep 19 09:58:07:667 2019


DpWpDynCreate: created new work process W17-9280

Thu Sep 19 10:00:14:469 2019


DpWpCheck: dyn W16, pid 8270 no longer needed, terminate now

Thu Sep 19 10:00:15:373 2019


DpHdlDeadWp: W16 (pid=8270) terminated automatically

Thu Sep 19 10:02:08:785 2019


DpWpDynCreate: created new work process W20-13496

Thu Sep 19 10:02:15:329 2019


DpWpDynCreate: created new work process W18-13611

Thu Sep 19 10:03:14:476 2019


DpWpCheck: dyn W17, pid 9280 no longer needed, terminate now

Thu Sep 19 10:03:15:179 2019


DpHdlDeadWp: W17 (pid=9280) terminated automatically

Thu Sep 19 10:07:10:335 2019


DpHdlDeadWp: W20 (pid=13496) terminated automatically

Thu Sep 19 10:07:20:198 2019


DpHdlDeadWp: W18 (pid=13611) terminated automatically

Thu Sep 19 10:07:35:432 2019


DpWpDynCreate: created new work process W19-2357

Thu Sep 19 10:12:06:033 2019


DpWpDynCreate: created new work process W16-9491

Thu Sep 19 10:12:36:260 2019


DpHdlDeadWp: W19 (pid=2357) terminated automatically

Thu Sep 19 10:17:14:503 2019


DpWpCheck: dyn W16, pid 9491 no longer needed, terminate now

Thu Sep 19 10:17:15:066 2019


DpHdlDeadWp: W16 (pid=9491) terminated automatically

Thu Sep 19 10:23:43:287 2019


DpWpDynCreate: created new work process W17-13347

Thu Sep 19 10:27:44:181 2019


DpWpDynCreate: created new work process W20-14546

Thu Sep 19 10:28:54:520 2019


DpWpCheck: dyn W17, pid 13347 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=13347) terminated automatically

Thu Sep 19 10:29:06:168 2019


DpWpDynCreate: created new work process W18-14978

Thu Sep 19 10:32:49:556 2019


DpHdlDeadWp: W20 (pid=14546) terminated automatically

Thu Sep 19 10:34:14:528 2019


DpWpCheck: dyn W18, pid 14978 no longer needed, terminate now

Thu Sep 19 10:34:14:841 2019


DpHdlDeadWp: W18 (pid=14978) terminated automatically

Thu Sep 19 10:34:19:287 2019


DpWpDynCreate: created new work process W19-16489

Thu Sep 19 10:34:31:256 2019


DpWpDynCreate: created new work process W16-16625

Thu Sep 19 10:38:14:774 2019


*** ERROR => DpHdlDeadWp: W12 (pid 8833) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W12 (pid = 8833)

********** SERVER SNAPSHOT 172 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:38:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 10:38:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 9, standby_wps 0
#dia = 9
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 7
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 1
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 1
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 7
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 6
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 2

Queue Statistics Thu Sep 19 10:38:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 6 (peak 291, writeCount 15219947, readCount 15219941)


UPD : 0 (peak 31, writeCount 3424, readCount 3424)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1396156, readCount 1396156)
SPO : 0 (peak 2, writeCount 16369, readCount 16369)
UP2 : 0 (peak 1, writeCount 1547, readCount 1547)
DISP: 0 (peak 67, writeCount 561157, readCount 561157)
GW : 0 (peak 49, writeCount 14155310, readCount 14155310)
ICM : 0 (peak 186, writeCount 254050, readCount 254050)
LWP : 0 (peak 15, writeCount 24733, readCount 24733)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <T49_U4869_M0> (1 requests):


- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T67_U14937_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T145_U14888_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T52_U15033_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T107_U12124_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T3_U4870_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 10:38:14 2019


------------------------------------------------------------

Current snapshot id: 172


DB clean time (in percent of total time) : 24.30 %
Number of preemptions : 85

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 1|23287 |DIA |WP_RUN | |206|low |T83_U15054_M0 |ASYNC_RFC| | |0| |001|SM_EFWK |REPLOAD ||
| 3|19909 |DIA |WP_RUN | | |low |T147_U14950_M0 |ASYNC_RFC| | |4|<HANDLE RFC> |001|SM_EFWK |READSEQ ||
| 12| |BTC |WP_KILL| |207|low |T118_U14853_M0 |BATCH | | || |001|SM_EFWK |REPLOAD ||

Found 3 active workprocesses


Total number of workprocesses is 18

Session Table Thu Sep 19 10:38:14 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|RFC_UI |T3_U4870_M0 |001|EXT_SCHAITAN| |10:38:00|0 |SAPMSSY1 |high|1 || | 4237|
|SYNC_RFC |T4_U10252_M0 |001|SMD_RFC |smprd02.niladv.org |10:38:13|0 |SAPMSSY1 |norm| || | 4249|
|BGRFC_SCHEDU|T21_U29718_M0 |001|BGRFC_SUSR |smprd02.niladv.org |10:38:02|19 |SAPMSSY1 |high| || | 4247|
|SYNC_RFC |T47_U26485_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |10:38:04|7 |SAPMSSY1 |norm| || | 4234|
|ASYNC_RFC |T49_U4869_M0 |001|EXT_SCHAITAN| |10:38:00|19 |SAPMSSY1 |low |1 || | 4237|
|ASYNC_RFC |T52_U15033_M0 |001|SM_EFWK | |10:38:13|6 |SAPMSSY1 |low |1 || | 4239|
|HTTP_NORMAL |T54_U13520_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:36|4 |SAPMHTTP |high| || | 12827|
|ASYNC_RFC |T59_U14898_M0 |001|SM_EFWK |smprd02.niladv.org |10:38:14|6 |SAPMSSY1 |low | || | 8339|
|ASYNC_RFC |T67_U14937_M0 |001|SM_EFWK | |10:38:08|0 |SAPMSSY1 |low |1 || | 4239|
|SYNC_RFC |T68_U27085_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |10:36:48|2 |SAPMSSY1 |norm| || | 4233|
|ASYNC_RFC |T74_U14943_M0 |001|SM_EFWK |smprd02.niladv.org |10:38:10|2 |SAPMSSY1 |low | || | 4239|
|SYNC_RFC |T82_U6895_M0 |001|SAPJSF |smprd02.niladv.org |10:38:11|7 |SAPMSSY1 |norm| || | 4246|
|ASYNC_RFC |T83_U15054_M0 |001|SM_EFWK |smprd02.niladv.org |10:38:14|1 |SAPMSSY1 |low | || | 4248|
|ASYNC_RFC |T98_U14944_M0 |001|SM_EFWK |smprd02.niladv.org |10:38:11|4 |SAPMSSY1 |low | || | 4256|
|ASYNC_RFC |T107_U12124_M0 |001|SM_EFWK | |10:25:02|3 |SAPMSSY1 |low |1 || | 4239|
|BGRFC_SCHEDU|T109_U29719_M0 |001|BGRFC_SUSR |smprd02.niladv.org |10:36:00|5 |SAPMSSY1 |high| || | 4234|
|HTTP_NORMAL |T116_U13546_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:57|4 |SAPMHTTP |high| || | 12827|
|BATCH |T118_U14853_M0 |001|SM_EFWK | |10:38:14|12 |E2E_EFWK_RESOURCE_MGR |low | || | 12426|
|ASYNC_RFC |T145_U14888_M0 |001|SM_EFWK | |10:38:05|4 |SAPMSSY1 |low |1 || | 41103|
|ASYNC_RFC |T147_U14950_M0 |001|SM_EFWK |smprd02.niladv.org |10:38:10|3 |SAPMSSY1 |low | || | 4213|

Found 20 logons with 20 sessions


Total ES (gross) memory of all sessions: 147 MB
Most ES (gross) memory allocated by T145_U14888_M0: 40 MB

RFC-Connection Table (23 entries) Thu Sep 19 10:38:14 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 8|14674056|14674056SU15033_M0 |T52_U15033_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 6|Thu Sep 19 10:38:13 2019 |
| 9|14450065|14450065SU14888_M0 |T145_U14888_M0_I|ALLOCATED |SERVER|NO_REQUEST| 4|Thu Sep 19 10:38:05 2019 |
| 12|76511970|76511970SU4869_M0 |T49_U4869_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 2|Thu Sep 19 00:00:01 2019 |
| 26|05288082|05288082CU12124_M0 |T107_U12124_M0_I|ALLOCATED |CLIENT|SAP_SEND | 3|Thu Sep 19 10:25:02 2019 |
| 31|14525263|14525263SU14944_M0 |T98_U14944_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 0|Thu Sep 19 10:38:08 2019 |
| 36|14533542|14533542CU14944_M0 |T98_U14944_M0_I0|ALLOCATED |CLIENT|SAP_SEND | 4|Thu Sep 19 10:38:11 2019 |
| 40|14524257|14524257SU14943_M0 |T74_U14943_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 0|Thu Sep 19 10:38:08 2019 |
| 46|91361270|91361270SU27085_M0 |T68_U27085_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 2|Thu Sep 19 10:36:48 2019 |
| 49|13935998|13935998SU12124_M0 |T107_U12124_M0_I|ALLOCATED |SERVER|NO_REQUEST| 3|Thu Sep 19 10:25:02 2019 |
| 57|14723176|14723176CU14898_M0 |T59_U14898_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 6|Thu Sep 19 10:38:14 2019 |
| 88|14518120|14518120SU14937_M0 |T67_U14937_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 0|Thu Sep 19 10:38:08 2019 |
| 110|13778491|13778491SU10252_M0 |T4_U10252_M0_I0 |ALLOCATED |SERVER|SAP_SEND | 0|Thu Sep 19 10:38:13 2019 |
| 128|14524257|14524257CU14937_M0 |T67_U14937_M0_I0|ALLOCATED |CLIENT|SAP_SEND | 0|Thu Sep 19 10:38:08 2019 |
| 168|91508324|91508324SU26485_M0 |T47_U26485_M0_I0|ALLOCATED |SERVER|RECEIVE | 7|Thu Sep 19 10:38:04 2019 |
| 170|12561360|12561360SU6895_M0 |T82_U6895_M0_I0 |ALLOCATED |SERVER|SAP_SEND | 7|Thu Sep 19 10:38:11 2019 |
| 175|14525263|14525263CU14943_M0 |T74_U14943_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 2|Thu Sep 19 10:38:10 2019 |
| 176|06119079|06119079CU15033_M0 |T52_U15033_M0_I0|ALLOCATED |CLIENT|SAP_SEND | 6|Thu Sep 19 10:38:13 2019 |
| 177|14462887|14462887SU14898_M0 |T59_U14898_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Thu Sep 19 10:38:05 2019 |
| 193|14723176|14723176SU15054_M0 |T83_U15054_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Thu Sep 19 10:38:14 2019 |
| 198|14462887|14462887CU14888_M0 |T145_U14888_M0_I|ALLOCATED |CLIENT|SAP_SEND | 4|Thu Sep 19 10:38:05 2019 |
| 210|76514010|76514010SU4870_M0 |T3_U4870_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 2|Thu Sep 19 00:00:02 2019 |
| 255|91361270|91361270CU26485_M0 |T47_U26485_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 0|Thu Sep 19 07:16:53 2019 |
| 272|14533542|14533542SU14950_M0 |T147_U14950_M0_I|ALLOCATED |SERVER|NO_REQUEST| 3|Thu Sep 19 10:38:10 2019 |

Found 23 RFC-Connections

CA Blocks
------------------------------------------------------------
1 WORKER 23287
5 WORKER 19909
336 INVALID -1
338 INVALID -1
339 INVALID -1
343 INVALID -1
344 INVALID -1
346 INVALID -1
8 ca_blk slots of 6000 in use, 6 currently unowned (in request queues)

MPI Info Thu Sep 19 10:38:14 2019


------------------------------------------------------------
Current pipes in use: 7
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 10:38:14 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 3718| 105| ||
| 1|DDLOG | 3718| 105| ||
| 2|BTCSCHED | 7437| 47| ||
| 3|RESTART_ALL | 1487| 102| ||
| 4|ENVCHECK | 22314| 20| ||
| 5|AUTOABAP | 1487| 102| ||
| 6|BGRFC_WATCHDOG | 1488| 102| ||
| 7|AUTOTH | 639| 47| ||
| 8|AUTOCCMS | 7437| 47| ||
| 9|AUTOSECURITY | 7437| 47| ||
| 10|LOAD_CALCULATION | 445644| 1| ||
| 11|SPOOLALRM | 7438| 47| ||
| 12|CALL_DELAYED | 0| 497| ||
| 13|TIMEOUT | 0| 46|T49_U4869_M0 |41058139|
| 14|TIMEOUT | 0| 166|T109_U29719_M0 |41051746|
| 16|TIMEOUT | 0| 288|T21_U29718_M0 |41058276|
| 17|TIMEOUT | 0| 46|T3_U4870_M0 |41058136|

Found 17 periodic tasks

********** SERVER SNAPSHOT 172 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:38:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
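The session-table summary in the snapshot above ("Total ES (gross) memory of all sessions: 147 MB", "Most ES (gross) memory allocated by T145_U14888_M0: 40 MB") can be cross-checked against the per-session "ES Mem(KB)" column. A minimal sketch, using a hand-copied subset of the KB figures printed above (the dictionary below is illustrative, not a parser of the trace):

```python
# ES Mem(KB) figures hand-copied from the Session Table above (subset only).
sessions_kb = {
    "T145_U14888_M0": 41103,  # largest consumer in snapshot 172
    "T54_U13520_M0": 12827,
    "T118_U14853_M0": 12426,
    "T3_U4870_M0": 4237,
}

# The trace reports gross MB rounded down, so 41103 KB prints as 40 MB.
top = max(sessions_kb, key=sessions_kb.get)
top_mb = sessions_kb[top] // 1024

print(top, top_mb)  # → T145_U14888_M0 40
```

This matches the "Most ES (gross) memory allocated by T145_U14888_M0: 40 MB" line in the snapshot.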


DpWpDynCreate: created new work process W12-17816

Thu Sep 19 10:38:16:539 2019


*** ERROR => DpHdlDeadWp: W12 (pid 17816) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17816) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 17816)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
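From this point the trace settles into a loop: the dispatcher recreates W12 (DpWpDynCreate), and the child exits with code 255 within seconds (DpHdlDeadWp ... status=65280). A loop like this can be spotted mechanically. The sketch below is a hypothetical helper, with the line format assumed from the messages above; `restart_loops` is not part of any SAP tooling:

```python
import re

# Miniature sample in the dev_disp line format shown above (assumption:
# real usage would read the trace file instead of this inline string).
LOG = """\
DpWpDynCreate: created new work process W12-17816
*** ERROR => DpHdlDeadWp: W12 (pid 17816) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=17816) exited with exit code 255
DpWpDynCreate: created new work process W12-18683
*** ERROR => DpHdlDeadWp: W12 (pid 18683) died (severity=0, status=65280) [dpxxwp.c 1463]
"""

CREATE = re.compile(r"DpWpDynCreate: created new work process W(\d+)-(\d+)")
DIED = re.compile(r"DpHdlDeadWp: W(\d+) \(pid (\d+)\) died .*status=(\d+)")

def restart_loops(text, threshold=2):
    """Count nonzero-status deaths per work-process slot; flag repeat offenders."""
    deaths = {}
    for m in DIED.finditer(text):
        slot, _pid, status = m.groups()
        if status != "0":
            deaths[slot] = deaths.get(slot, 0) + 1
    return {slot: n for slot, n in deaths.items() if n >= threshold}

created = {f"W{slot}" for slot, _pid in CREATE.findall(LOG)}
print(created, restart_loops(LOG))  # → {'W12'} {'12': 2}
```

Note that lines like "DpHdlDeadWp: W16 (pid=16489) terminated automatically" deliberately do not match `DIED`: those are normal teardowns of dynamic work processes, not crashes.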

Thu Sep 19 10:38:34:775 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:38:54:775 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:39:14:776 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 18076

Thu Sep 19 10:39:23:148 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot
DpWpCheck: dyn W19, pid 16489 no longer needed, terminate now

Thu Sep 19 10:39:24:282 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot
DpHdlDeadWp: W19 (pid=16489) terminated automatically

Thu Sep 19 10:39:34:777 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot
DpWpCheck: dyn W16, pid 16625 no longer needed, terminate now
DpCheckSapcontrolProcess: sapcontrol with pid 18076 terminated

Thu Sep 19 10:39:35:301 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot
DpHdlDeadWp: W16 (pid=16625) terminated automatically

Thu Sep 19 10:39:54:777 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:40:14:778 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot
DpWpDynCreate: created new work process W12-18683

Thu Sep 19 10:40:16:479 2019


*** ERROR => DpHdlDeadWp: W12 (pid 18683) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=18683) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 18683)
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:40:34:778 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:40:54:779 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:41:14:780 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:41:34:780 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:41:54:780 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:42:14:781 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:42:34:782 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:42:54:782 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:43:14:783 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:43:34:784 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:43:54:784 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:39:14 2019, skip new snapshot

Thu Sep 19 10:44:14:784 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 173 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:44:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 10:44:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Thu Sep 19 10:44:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 3 (peak 291, writeCount 15220413, readCount 15220410)


UPD : 0 (peak 31, writeCount 3427, readCount 3427)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1396164, readCount 1396164)
SPO : 0 (peak 2, writeCount 16383, readCount 16383)
UP2 : 0 (peak 1, writeCount 1549, readCount 1549)
DISP: 0 (peak 67, writeCount 561213, readCount 561213)
GW : 0 (peak 49, writeCount 14155508, readCount 14155508)
ICM : 1 (peak 186, writeCount 254078, readCount 254077)
LWP : 1 (peak 15, writeCount 24767, readCount 24766)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T49_U4869_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T52_U15033_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T107_U12124_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:


Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 10:44:14 2019


------------------------------------------------------------

Current snapshot id: 173


DB clean time (in percent of total time) : 24.30 %
Number of preemptions : 85

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 12| |BTC |WP_KILL| |209|low |T118_U14853_M0 |BATCH | | ||CL_E2EEFWK_RESOURCE_MGR=======CP |001|SM_EFWK |REPLOAD ||

Found 1 active workprocesses


Total number of workprocesses is 16

Session Table Thu Sep 19 10:44:14 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|SYNC_RFC |T4_U10252_M0 |001|SMD_RFC |smprd02.niladv.org |10:38:13|0 |SAPMSSY1 |norm| || | 4249|
|SYNC_RFC |T47_U26485_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |10:44:04|4 |SAPMSSY1 |norm| || | 4234|
|ASYNC_RFC |T49_U4869_M0 |001|EXT_SCHAITAN| |10:44:00|7 |SAPMSSY1 |low |1 || | 4237|
|ASYNC_RFC |T52_U15033_M0 |001|SM_EFWK | |10:38:13|6 |SAPMSSY1 |low |1 || | 4239|
|HTTP_NORMAL |T54_U13520_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:36|4 |SAPMHTTP |high| || | 12827|
|ASYNC_RFC |T107_U12124_M0 |001|SM_EFWK | |10:25:02|3 |SAPMSSY1 |low |1 || | 4239|
|HTTP_NORMAL |T116_U13546_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:57|4 |SAPMHTTP |high| || | 12827|
|BATCH |T118_U14853_M0 |001|SM_EFWK | |10:38:14|12 |E2E_EFWK_RESOURCE_MGR |low | || | 12426|
Found 8 logons with 8 sessions
Total ES (gross) memory of all sessions: 57 MB
Most ES (gross) memory allocated by T54_U13520_M0: 12 MB

RFC-Connection Table (7 entries) Thu Sep 19 10:44:14 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 8|14674056|14674056SU15033_M0 |T52_U15033_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 6|Thu Sep 19 10:38:13 2019 |
| 12|76511970|76511970SU4869_M0 |T49_U4869_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 2|Thu Sep 19 00:00:01 2019 |
| 26|05288082|05288082CU12124_M0 |T107_U12124_M0_I|ALLOCATED |CLIENT|SAP_SEND | 3|Thu Sep 19 10:25:02 2019 |
| 49|13935998|13935998SU12124_M0 |T107_U12124_M0_I|ALLOCATED |SERVER|NO_REQUEST| 3|Thu Sep 19 10:25:02 2019 |
| 110|13778491|13778491SU10252_M0 |T4_U10252_M0_I0 |ALLOCATED |SERVER|SAP_SEND | 0|Thu Sep 19 10:38:13 2019 |
| 168|91508324|91508324SU26485_M0 |T47_U26485_M0_I0|ALLOCATED |SERVER|RECEIVE | 4|Thu Sep 19 10:44:04 2019 |
| 176|06119079|06119079CU15033_M0 |T52_U15033_M0_I0|ALLOCATED |CLIENT|SAP_SEND | 6|Thu Sep 19 10:38:13 2019 |

Found 7 RFC-Connections

CA Blocks
------------------------------------------------------------
336 INVALID -1
343 INVALID -1
344 INVALID -1
3 ca_blk slots of 6000 in use, 3 currently unowned (in request queues)

MPI Info Thu Sep 19 10:44:14 2019


------------------------------------------------------------
Current pipes in use: 127
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 10:44:14 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 3721| 105| ||
| 1|DDLOG | 3721| 105| ||
| 2|BTCSCHED | 7443| 47| ||
| 3|RESTART_ALL | 1488| 42| ||
| 4|ENVCHECK | 22332| 20| ||
| 5|AUTOABAP | 1488| 42| ||
| 6|BGRFC_WATCHDOG | 1489| 42| ||
| 7|AUTOTH | 645| 47| ||
| 8|AUTOCCMS | 7443| 47| ||
| 9|AUTOSECURITY | 7443| 47| ||
| 10|LOAD_CALCULATION | 446003| 1| ||
| 11|SPOOLALRM | 7444| 47| ||
| 12|CALL_DELAYED | 0| 137| ||
| 13|TIMEOUT | 0| 46|T49_U4869_M0 |41063768|

Found 14 periodic tasks

********** SERVER SNAPSHOT 173 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:44:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
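In the Queue Statistics blocks of these snapshots, the fill level printed for each queue (e.g. "DIA : 3") should equal writeCount minus readCount on the same line. A small sketch that verifies this invariant; the regex is inferred from the line format shown above and the sample rows are hand-copied from snapshot 173:

```python
import re

# Sample "Queue Statistics" rows as printed in the snapshots above.
ROWS = """\
DIA : 3 (peak 291, writeCount 15220413, readCount 15220410)
UPD : 0 (peak 31, writeCount 3427, readCount 3427)
ICM : 1 (peak 186, writeCount 254078, readCount 254077)
"""

ROW = re.compile(r"(\w+)\s*:\s*(\d+)\s*\(peak (\d+), writeCount (\d+), readCount (\d+)\)")

def backlog(text):
    """Return {queue: (printed fill level, writeCount - readCount)} per row."""
    out = {}
    for name, fill, _peak, w, r in ROW.findall(text):
        out[name] = (int(fill), int(w) - int(r))
    return out

for name, (fill, diff) in backlog(ROWS).items():
    assert fill == diff, name  # the fill level is exactly the unread delta

print(backlog(ROWS)["DIA"])  # → (3, 3)
```

A growing delta between successive snapshots for the same queue (here DIA went from 6 to 3 to 2) is a quick way to judge whether requests are draining or piling up.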

Thu Sep 19 10:44:34:784 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:44:54:785 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:45:14:786 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-20217
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 20218

Thu Sep 19 10:45:16:490 2019


*** ERROR => DpHdlDeadWp: W12 (pid 20217) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20217) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 20217)
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:45:21:739 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:45:34:787 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 20218 terminated

Thu Sep 19 10:45:54:788 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:46:14:789 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:46:34:789 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:46:54:790 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:47:14:790 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:47:34:791 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:47:54:791 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:48:14:791 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:48:34:792 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:48:54:793 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:49:14:793 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:49:34:794 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:49:54:794 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:45:14 2019, skip new snapshot

Thu Sep 19 10:50:14:795 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 174 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:50:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 10:50:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Thu Sep 19 10:50:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000


DIA : 2 (peak 291, writeCount 15221118, readCount 15221116)
UPD : 0 (peak 31, writeCount 3428, readCount 3428)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1396168, readCount 1396168)
SPO : 0 (peak 2, writeCount 16396, readCount 16396)
UP2 : 0 (peak 1, writeCount 1550, readCount 1550)
DISP: 0 (peak 67, writeCount 561287, readCount 561287)
GW : 0 (peak 49, writeCount 14155941, readCount 14155941)
ICM : 0 (peak 186, writeCount 254105, readCount 254105)
LWP : 1 (peak 15, writeCount 24782, readCount 24781)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W12> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T52_U15033_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T107_U12124_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 10:50:14 2019


------------------------------------------------------------

Current snapshot id: 174


DB clean time (in percent of total time) : 24.31 %
Number of preemptions : 85

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 12| |BTC |WP_KILL| |210|low |T118_U14853_M0 |BATCH | | ||CL_E2EEFWK_RESOURCE_MGR=======CP |001|SM_EFWK |REPLOAD ||

Found 1 active workprocesses


Total number of workprocesses is 16

Session Table Thu Sep 19 10:50:14 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T2_U15352_M0 |001|EXT_RKUDUMUL|SST-LAP-LEN0028 |10:47:19|5 |SAPMSYST |high| ||SESSION_MA| 4233|
|SYNC_RFC |T47_U26485_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |10:50:04|3 |SAPMSSY1 |norm| || | 4234|
|ASYNC_RFC |T52_U15033_M0 |001|SM_EFWK | |10:38:13|6 |SAPMSSY1 |low |1 || | 4239|
|HTTP_NORMAL |T54_U13520_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:36|4 |SAPMHTTP |high| || | 12827|
|ASYNC_RFC |T107_U12124_M0 |001|SM_EFWK | |10:25:02|3 |SAPMSSY1 |low |1 || | 4239|
|HTTP_NORMAL |T116_U13546_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:57|4 |SAPMHTTP |high| || | 12827|
|BATCH |T118_U14853_M0 |001|SM_EFWK | |10:38:14|12 |E2E_EFWK_RESOURCE_MGR |low | || | 12426|

Found 7 logons with 7 sessions


Total ES (gross) memory of all sessions: 53 MB
Most ES (gross) memory allocated by T54_U13520_M0: 12 MB

RFC-Connection Table (5 entries) Thu Sep 19 10:50:14 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 8|14674056|14674056SU15033_M0 |T52_U15033_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 6|Thu Sep 19 10:38:13 2019 |
| 26|05288082|05288082CU12124_M0 |T107_U12124_M0_I|ALLOCATED |CLIENT|SAP_SEND | 3|Thu Sep 19 10:25:02 2019 |
| 49|13935998|13935998SU12124_M0 |T107_U12124_M0_I|ALLOCATED |SERVER|NO_REQUEST| 3|Thu Sep 19 10:25:02 2019 |
| 168|91508324|91508324SU26485_M0 |T47_U26485_M0_I0|ALLOCATED |SERVER|RECEIVE | 3|Thu Sep 19 10:50:04 2019 |
| 176|06119079|06119079CU15033_M0 |T52_U15033_M0_I0|ALLOCATED |CLIENT|SAP_SEND | 6|Thu Sep 19 10:38:13 2019 |

Found 5 RFC-Connections

CA Blocks
------------------------------------------------------------
343 INVALID -1
344 INVALID -1
2 ca_blk slots of 6000 in use, 2 currently unowned (in request queues)

MPI Info Thu Sep 19 10:50:14 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 10:50:14 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 3724| 105| | |
| 1|DDLOG | 3724| 105| | |
| 2|BTCSCHED | 7449| 47| | |
| 3|RESTART_ALL | 1490| 282| | |
| 4|ENVCHECK | 22350| 20| | |
| 5|AUTOABAP | 1490| 282| | |
| 6|BGRFC_WATCHDOG | 1491| 282| | |
| 7|AUTOTH | 651| 47| | |
| 8|AUTOCCMS | 7449| 47| | |
| 9|AUTOSECURITY | 7449| 47| | |
| 10|LOAD_CALCULATION | 446361| 0| | |
| 11|SPOOLALRM | 7450| 47| | |
| 12|CALL_DELAYED | 0| 1002| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 174 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:50:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


DpWpDynCreate: created new work process W12-22364

Thu Sep 19 10:50:16:520 2019


*** ERROR => DpHdlDeadWp: W12 (pid 22364) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=22364) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 22364)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:50:34:795 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:50:54:795 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Thu Sep 19 10:51:14:796 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22622

Thu Sep 19 10:51:21:802 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:51:34:796 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22622 terminated

Thu Sep 19 10:51:54:797 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:52:14:798 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:52:34:798 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:52:54:799 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:53:14:799 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:53:34:800 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:53:54:801 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:54:14:801 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:54:34:802 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:54:54:802 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:55:14:803 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot
DpWpDynCreate: created new work process W12-24227

Thu Sep 19 10:55:16:556 2019


*** ERROR => DpHdlDeadWp: W12 (pid 24227) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=24227) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 24227)
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:55:34:803 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:55:54:803 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:51:14 2019, skip new snapshot

Thu Sep 19 10:56:14:804 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 175 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:56:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 10:56:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Thu Sep 19 10:56:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 15221933, readCount 15221933)


UPD : 0 (peak 31, writeCount 3429, readCount 3429)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1396172, readCount 1396172)
SPO : 0 (peak 2, writeCount 16409, readCount 16409)
UP2 : 0 (peak 1, writeCount 1551, readCount 1551)
DISP: 0 (peak 67, writeCount 561335, readCount 561335)
GW : 0 (peak 49, writeCount 14156505, readCount 14156505)
ICM : 1 (peak 186, writeCount 254133, readCount 254132)
LWP : 0 (peak 15, writeCount 24797, readCount 24797)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 10:56:14 2019


------------------------------------------------------------

Current snapshot id: 175


DB clean time (in percent of total time) : 24.32 %
Number of preemptions : 85

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 12| |BTC |WP_KILL| |212|low |T118_U14853_M0 |BATCH | | | |CL_E2EEFWK_RESOURCE_MGR=======CP |001|SM_EFWK |REPLOAD | |

Found 1 active workprocesses


Total number of workprocesses is 16

Session Table Thu Sep 19 10:56:14 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T2_U15352_M0 |001|EXT_RKUDUMUL|SST-LAP-LEN0028 |10:47:19|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|SYNC_RFC |T47_U26485_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |10:56:04|4 |SAPMSSY1 |norm| | | | 4234|
|HTTP_NORMAL |T54_U13520_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:36|4 |SAPMHTTP |high| | | | 12827|
|HTTP_NORMAL |T116_U13546_M0 |001|EXT_SCHAITAN|10.54.36.53 |10:31:57|4 |SAPMHTTP |high| | | | 12827|
|BATCH |T118_U14853_M0 |001|SM_EFWK | |10:38:14|12 |E2E_EFWK_RESOURCE_MGR |low | | | | 12426|

Found 5 logons with 5 sessions


Total ES (gross) memory of all sessions: 45 MB
Most ES (gross) memory allocated by T54_U13520_M0: 12 MB

RFC-Connection Table (1 entries) Thu Sep 19 10:56:14 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 168|91508324|91508324SU26485_M0 |T47_U26485_M0_I0|ALLOCATED |SERVER|RECEIVE | 4|Thu Sep 19 10:56:04 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Thu Sep 19 10:56:14 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 10:56:14 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 3727| 105| | |
| 1|DDLOG | 3727| 105| | |
| 2|BTCSCHED | 7455| 47| | |
| 3|RESTART_ALL | 1491| 222| | |
| 4|ENVCHECK | 22368| 20| | |
| 5|AUTOABAP | 1491| 222| | |
| 6|BGRFC_WATCHDOG | 1492| 222| | |
| 7|AUTOTH | 657| 47| | |
| 8|AUTOCCMS | 7455| 47| | |
| 9|AUTOSECURITY | 7455| 47| | |
| 10|LOAD_CALCULATION | 446720| 1| | |
| 11|SPOOLALRM | 7456| 47| | |
| 12|CALL_DELAYED | 0| 642| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 175 (Reason: Workprocess 12 died / Time: Thu Sep 19 10:56:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

Thu Sep 19 10:56:34:805 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:56:54:805 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 10:57:14:806 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 25053
Thu Sep 19 10:57:21:863 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:57:34:806 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 25053 terminated

Thu Sep 19 10:57:54:806 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:58:14:807 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:58:34:808 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:58:54:809 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:59:14:809 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:59:34:809 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 10:59:54:810 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:00:14:810 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot
DpWpDynCreate: created new work process W12-26263

Thu Sep 19 11:00:16:556 2019


*** ERROR => DpHdlDeadWp: W12 (pid 26263) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=26263) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 26263)
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:00:34:811 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:00:54:811 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:01:14:812 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:01:34:813 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:01:54:813 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 10:57:14 2019, skip new snapshot

Thu Sep 19 11:02:14:814 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 176 (Reason: Workprocess 12 died / Time: Thu Sep 19 11:02:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 11:02:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Thu Sep 19 11:02:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 15222777, readCount 15222777)


UPD : 0 (peak 31, writeCount 3430, readCount 3430)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1396176, readCount 1396176)
SPO : 0 (peak 2, writeCount 16422, readCount 16422)
UP2 : 0 (peak 1, writeCount 1552, readCount 1552)
DISP: 0 (peak 67, writeCount 561379, readCount 561379)
GW : 1 (peak 49, writeCount 14157071, readCount 14157070)
ICM : 0 (peak 186, writeCount 254165, readCount 254165)
LWP : 0 (peak 15, writeCount 24812, readCount 24812)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <GatewayQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 11:02:14 2019


------------------------------------------------------------

Current snapshot id: 176


DB clean time (in percent of total time) : 24.32 %
Number of preemptions : 85

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 12| |BTC |WP_KILL| |213|low |T118_U14853_M0 |BATCH | | | |CL_E2EEFWK_RESOURCE_MGR=======CP |001|SM_EFWK |REPLOAD | |

Found 1 active workprocesses


Total number of workprocesses is 16

Session Table Thu Sep 19 11:02:14 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T2_U15352_M0 |001|EXT_RKUDUMUL|SST-LAP-LEN0028 |10:47:19|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|SYNC_RFC |T47_U26485_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |11:02:04|3 |SAPMSSY1 |norm| | | | 4234|
|BATCH |T118_U14853_M0 |001|SM_EFWK | |10:38:14|12 |E2E_EFWK_RESOURCE_MGR |low | | | | 12426|

Found 3 logons with 3 sessions


Total ES (gross) memory of all sessions: 20 MB
Most ES (gross) memory allocated by T118_U14853_M0: 12 MB

RFC-Connection Table (1 entries) Thu Sep 19 11:02:14 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 168|91508324|91508324SU26485_M0 |T47_U26485_M0_I0|ALLOCATED |SERVER|RECEIVE | 3|Thu Sep 19 11:02:04 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Thu Sep 19 11:02:14 2019


------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 11:02:14 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 3730| 105| | |
| 1|DDLOG | 3730| 105| | |
| 2|BTCSCHED | 7461| 47| | |
| 3|RESTART_ALL | 1492| 162| | |
| 4|ENVCHECK | 22386| 20| | |
| 5|AUTOABAP | 1492| 162| | |
| 6|BGRFC_WATCHDOG | 1493| 162| | |
| 7|AUTOTH | 663| 47| | |
| 8|AUTOCCMS | 7461| 47| | |
| 9|AUTOSECURITY | 7461| 47| | |
| 10|LOAD_CALCULATION | 447079| 1| | |
| 11|SPOOLALRM | 7462| 47| | |
| 12|CALL_DELAYED | 0| 282| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 176 (Reason: Workprocess 12 died / Time: Thu Sep 19 11:02:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

Thu Sep 19 11:02:34:814 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 11:02:54:815 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 11:03:14:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 32656

Thu Sep 19 11:03:23:002 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:03:34:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 32656 terminated
Thu Sep 19 11:03:54:817 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:04:14:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:04:34:818 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:04:54:818 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:05:14:819 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot
DpWpDynCreate: created new work process W12-5597

Thu Sep 19 11:05:16:527 2019


*** ERROR => DpHdlDeadWp: W12 (pid 5597) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=5597) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 5597)
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:05:34:819 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:05:54:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:06:14:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:06:34:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:06:54:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot
Thu Sep 19 11:07:14:820 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:07:34:821 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:07:54:822 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:03:14 2019, skip new snapshot

Thu Sep 19 11:08:14:822 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 177 (Reason: Workprocess 12 died / Time: Thu Sep 19 11:08:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 11:08:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 5
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 0

Queue Statistics Thu Sep 19 11:08:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 0 (peak 291, writeCount 15223687, readCount 15223687)


UPD : 0 (peak 31, writeCount 3431, readCount 3431)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1396180, readCount 1396180)
SPO : 0 (peak 2, writeCount 16435, readCount 16435)
UP2 : 0 (peak 1, writeCount 1553, readCount 1553)
DISP: 0 (peak 67, writeCount 561427, readCount 561427)
GW : 0 (peak 49, writeCount 14157719, readCount 14157719)
ICM : 0 (peak 186, writeCount 254192, readCount 254192)
LWP : 0 (peak 15, writeCount 24827, readCount 24827)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 11:08:14 2019


------------------------------------------------------------

Current snapshot id: 177


DB clean time (in percent of total time) : 24.33 %
Number of preemptions : 85

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 12| |BTC |WP_KILL| |214|low |T118_U14853_M0 |BATCH | | | |CL_E2EEFWK_RESOURCE_MGR=======CP |001|SM_EFWK |REPLOAD | |

Found 1 active workprocesses


Total number of workprocesses is 16

Session Table Thu Sep 19 11:08:14 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI |T2_U15352_M0 |001|EXT_RKUDUMUL|SST-LAP-LEN0028 |10:47:19|5 |SAPMSYST |high| | |SESSION_MA| 4233|
|GUI |T8_U16567_M0 |001|EXT_MKARIM |SST-LAP-LEN0043 |11:05:38|7 |SAPMSYST |high| | |SESSION_MA| 4233|
|SYNC_RFC |T47_U26485_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |11:08:04|4 |SAPMSSY1 |norm| | | | 4234|
|BATCH |T118_U14853_M0 |001|SM_EFWK | |10:38:14|12 |E2E_EFWK_RESOURCE_MGR |low | | | | 12426|

Found 4 logons with 4 sessions


Total ES (gross) memory of all sessions: 24 MB
Most ES (gross) memory allocated by T118_U14853_M0: 12 MB

RFC-Connection Table (1 entries) Thu Sep 19 11:08:14 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 168|91508324|91508324SU26485_M0 |T47_U26485_M0_I0|ALLOCATED |SERVER|RECEIVE | 4|Thu Sep 19 11:08:04 2019 |

Found 1 RFC-Connections

CA Blocks
------------------------------------------------------------
0 ca_blk slots of 6000 in use, 0 currently unowned (in request queues)

MPI Info Thu Sep 19 11:08:14 2019


------------------------------------------------------------
Current pipes in use: 209
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 11:08:14 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 3733| 105| | |
| 1|DDLOG | 3733| 105| | |
| 2|BTCSCHED | 7467| 47| | |
| 3|RESTART_ALL | 1493| 102| | |
| 4|ENVCHECK | 22404| 20| | |
| 5|AUTOABAP | 1493| 102| | |
| 6|BGRFC_WATCHDOG | 1494| 102| | |
| 7|AUTOTH | 669| 47| | |
| 8|AUTOCCMS | 7467| 47| | |
| 9|AUTOSECURITY | 7467| 47| | |
| 10|LOAD_CALCULATION | 447437| 0| | |
| 11|SPOOLALRM | 7468| 47| | |
| 12|CALL_DELAYED | 0| 183| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 177 (Reason: Workprocess 12 died / Time: Thu Sep 19 11:08:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

Thu Sep 19 11:08:34:823 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 11:08:54:824 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 11:09:14:824 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 13315

Thu Sep 19 11:09:25:314 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:09:34:824 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 13315 terminated

Thu Sep 19 11:09:54:825 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:10:14:826 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot
DpWpDynCreate: created new work process W12-13928

Thu Sep 19 11:10:16:923 2019


*** ERROR => DpHdlDeadWp: W12 (pid 13928) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=13928) exited with exit code 255
DpWpRecoverMutex: recover resources of W12 (pid = 13928)
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:10:34:826 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:10:54:827 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:11:14:827 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:11:34:828 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:11:54:829 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:12:14:830 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:12:33:537 2019


DpWpDynCreate: created new work process W17-15441

Thu Sep 19 11:12:34:836 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:12:54:837 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:12:59:660 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot
DpHdlDeadWp: W13 (pid=4448) terminated automatically
DpWpDynCreate: created new work process W13-16274
Thu Sep 19 11:13:14:838 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:13:34:839 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:13:54:839 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Thu Sep 19 11:09:14 2019, skip new snapshot

Thu Sep 19 11:13:55:875 2019


DpWpDynCreate: created new work process W20-18072

Thu Sep 19 11:14:14:840 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 178 (Reason: Workprocess 12 died / Time: Thu Sep 19 11:14:14 2019) - begin **********

Server smprd02_SMP_00, Thu Sep 19 11:14:14 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 9, standby_wps 0
#dia = 9
#btc = 4
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 7
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 1
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 1
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 7
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 6
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Thu Sep 19 11:14:14 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 2

Max. number of queue elements : 14000

DIA : 3 (peak 291, writeCount 15236514, readCount 15236511)
UPD : 0 (peak 31, writeCount 3539, readCount 3539)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 1397679, readCount 1397679)
SPO : 0 (peak 2, writeCount 16449, readCount 16449)
UP2 : 0 (peak 1, writeCount 1555, readCount 1555)
DISP: 0 (peak 67, writeCount 561917, readCount 561917)
GW : 0 (peak 49, writeCount 14168121, readCount 14168121)
ICM : 0 (peak 186, writeCount 254383, readCount 254383)
LWP : 2 (peak 16, writeCount 24859, readCount 24857)
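The per-queue counters above (current depth, peak, writeCount, readCount) can be sanity-checked mechanically: for a consistent queue, writeCount minus readCount should equal the current depth. A minimal sketch, assuming the line layout shown in this trace (the helper name `parse_queue_stats` is my own, not from the SAP kernel):

```python
import re

# Hypothetical lines in the "Queue Statistics" layout shown above.
LINES = [
    "DIA : 3 (peak 291, writeCount 15236514, readCount 15236511)",
    "LWP : 2 (peak 16, writeCount 24859, readCount 24857)",
]

PAT = re.compile(
    r"(\w+)\s*:\s*(\d+)\s*\(peak (\d+), writeCount (\d+), readCount (\d+)\)"
)

def parse_queue_stats(lines):
    """Parse queue counter lines and flag write/read/depth inconsistencies."""
    stats = {}
    for line in lines:
        m = PAT.search(line)
        if not m:
            continue
        name = m.group(1)
        cur, peak, written, read = map(int, m.group(2, 3, 4, 5))
        stats[name] = {
            "current": cur,
            "peak": peak,
            "written": written,
            "read": read,
            # A backlog is consistent when written - read == current depth.
            "consistent": written - read == cur,
        }
    return stats

stats = parse_queue_stats(LINES)
print(stats["DIA"]["consistent"], stats["LWP"]["current"])
```

Both sample queues above check out (written minus read equals the reported depth), which is the expected state for a dispatcher that is keeping up.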

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W12> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T82_U17271_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T3_U18093_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T81_U17270_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=18689) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Thu Sep 19 11:14:14 2019


------------------------------------------------------------

Current snapshot id: 178


DB clean time (in percent of total time) : 24.33 %
Number of preemptions : 85

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  2|32121   |DIA |WP_RUN |     |   |low |T68_U18096_M0   |ASYNC_RFC|      |   |   18|CL_SQL_RESULT_SET=============CP        |001|SM_EFWK     |READSEQ             |                    |
| 12|        |BTC |WP_KILL|     |215|low |T118_U14853_M0  |BATCH    |      |   |     |CL_E2EEFWK_RESOURCE_MGR=======CP        |001|SM_EFWK     |REPLOAD             |                    |

Found 2 active workprocesses


Total number of workprocesses is 18

Session Table Thu Sep 19 11:14:14 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|GUI         |T2_U15352_M0    |001|EXT_RKUDUMUL|SST-LAP-LEN0028     |10:47:19|5  |SAPMSYST                                |high|     |                                                  |SESSION_MA|      4233|
|ASYNC_RFC   |T3_U18093_M0    |001|SM_EFWK     |                    |11:13:56|7  |SAPMSSY1                                |low |1    |                                                  |          |      4239|
|SYNC_RFC    |T4_U16961_M0    |001|SAPJSF      |smprd02.niladv.org  |11:13:57|7  |SAPMSSY1                                |norm|     |                                                  |          |      4246|
|SYNC_RFC    |T13_U17179_M0   |001|SMDAGENT_SMP|smprd02.niladv.org  |11:11:48|4  |SAPMSSY1                                |norm|     |                                                  |          |      4202|
|GUI         |T30_U17958_M0   |001|EXT_MKARIM  |SST-LAP-LEN0043     |11:13:48|4  |SAPLSDBACCMS                            |high|     |                                                  |DBACOCKPIT|     20636|
|SYNC_RFC    |T47_U26485_M0   |001|SMDAGENT_SMP|smprd02.niladv.org  |11:14:04|1  |SAPMSSY1                                |norm|     |                                                  |          |      4234|
|SYNC_RFC    |T52_U17792_M0   |001|SMD_RFC     |smprd02.niladv.org  |11:13:20|2  |SAPMSSY1                                |norm|     |                                                  |          |      4233|
|ASYNC_RFC   |T68_U18096_M0   |001|SM_EFWK     |smprd02.niladv.org  |11:13:56|2  |SAPMSSY1                                |low |     |                                                  |          |      4213|
|SYNC_RFC    |T74_U17771_M0   |001|SMD_RFC     |smprd02.niladv.org  |11:14:07|6  |SAPMSSY1                                |norm|     |                                                  |          |      4248|
|ASYNC_RFC   |T81_U17270_M0   |001|EXT_SCHAITAN|                    |11:14:00|1  |SAPMSSY1                                |low |1    |                                                  |          |      4237|
|RFC_UI      |T82_U17271_M0   |001|EXT_SCHAITAN|                    |11:14:00|0  |SAPMSSY1                                |high|1    |                                                  |          |      4237|
|SYNC_RFC    |T109_U17715_M0  |001|SMD_RFC     |smprd02.niladv.org  |11:14:10|5  |SAPMSSY1                                |norm|     |                                                  |          |      4249|
|BATCH       |T118_U14853_M0  |001|SM_EFWK     |                    |10:38:14|12 |E2E_EFWK_RESOURCE_MGR                   |low |     |                                                  |          |     12426|
|HTTP_NORMAL |T153_U18044_M0  |001|EXT_MKARIM  |10.1.88.42          |11:14:12|4  |SAPMHTTP                                |high|     |                                                  |          |     53788|

Found 14 logons with 14 sessions


Total ES (gross) memory of all sessions: 130 MB
Most ES (gross) memory allocated by T153_U18044_M0: 52 MB

RFC-Connection Table (12 entries) Thu Sep 19 11:14:14 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
|  12|16608466|16608466SU17792_M0              |T52_U17792_M0_I0|ALLOCATED       |SERVER|SAP_SEND  |  2|Thu Sep 19 11:13:20 2019        |
|  26|16690213|16690213SU16961_M0              |T4_U16961_M0_I0 |ALLOCATED       |SERVER|SAP_SEND  |  7|Thu Sep 19 11:13:57 2019        |
|  40|17012397|17012397CU18093_M0              |T3_U18093_M0_I0 |ALLOCATED       |CLIENT|NO_REQUEST|  7|Thu Sep 19 11:13:56 2019        |
| 128|17478707|17478707SU17715_M0              |T109_U17715_M0_I|ALLOCATED       |SERVER|SAP_SEND  |  5|Thu Sep 19 11:14:10 2019        |
| 138|16909130|16909130CU26485_M0              |T47_U26485_M0_I0|ALLOCATED       |CLIENT|NO_REQUEST|  1|Thu Sep 19 11:11:48 2019        |
| 168|91508324|91508324SU26485_M0              |T47_U26485_M0_I0|ALLOCATED       |SERVER|RECEIVE   |  1|Thu Sep 19 11:14:04 2019        |
| 170|16992135|16992135SU17271_M0              |T82_U17271_M0_I0|ALLOCATED       |SERVER|NO_REQUEST|  6|Thu Sep 19 11:12:01 2019        |
| 176|16909130|16909130SU17179_M0              |T13_U17179_M0_I0|ALLOCATED       |SERVER|NO_REQUEST|  4|Thu Sep 19 11:11:48 2019        |
| 193|17012397|17012397SU18096_M0              |T68_U18096_M0_I0|ALLOCATED       |SERVER|NO_REQUEST|  2|Thu Sep 19 11:13:56 2019        |
| 204|16991120|16991120SU17270_M0              |T81_U17270_M0_I0|ALLOCATED       |SERVER|NO_REQUEST|  2|Thu Sep 19 11:12:01 2019        |
| 219|17009309|17009309SU18093_M0              |T3_U18093_M0_I0 |ALLOCATED       |SERVER|NO_REQUEST|  7|Thu Sep 19 11:13:56 2019        |
| 268|16569620|16569620SU17771_M0              |T74_U17771_M0_I0|ALLOCATED       |SERVER|SAP_SEND  |  6|Thu Sep 19 11:14:07 2019        |

Found 12 RFC-Connections

CA Blocks
------------------------------------------------------------
3 WORKER 32121
335 INVALID -1
339 INVALID -1
342 INVALID -1
4 ca_blk slots of 6000 in use, 3 currently unowned (in request queues)

MPI Info Thu Sep 19 11:14:14 2019


------------------------------------------------------------
Current pipes in use: 145
Current / maximal blocks in use: 0 / 1884

Periodic Tasks Thu Sep 19 11:14:14 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      3736|       105|                    |          |
|       1|DDLOG               |      3736|       105|                    |          |
|       2|BTCSCHED            |      7473|        47|                    |          |
|       3|RESTART_ALL         |      1494|        42|                    |          |
|       4|ENVCHECK            |     22422|        20|                    |          |
|       5|AUTOABAP            |      1494|        42|                    |          |
|       6|BGRFC_WATCHDOG      |      1495|        42|                    |          |
|       7|AUTOTH              |       675|        47|                    |          |
|       8|AUTOCCMS            |      7473|        47|                    |          |
|       9|AUTOSECURITY        |      7473|        47|                    |          |
|      10|LOAD_CALCULATION    |    447796|         1|                    |          |
|      11|SPOOLALRM           |      7474|        47|                    |          |
|      12|CALL_DELAYED        |         0|        70|                    |          |
|      14|TIMEOUT             |         0|        46|T81_U17270_M0       |  41100903|
|      15|TIMEOUT             |         0|        46|T82_U17271_M0       |  41100906|

Found 15 periodic tasks

********** SERVER SNAPSHOT 178 (Reason: Workprocess 12 died / Time: Thu Sep 19 11:14:14 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
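The snapshot loop above is driven by work process W12 dying and being recreated, visible as paired DpWpDynCreate / DpHdlDeadWp lines. When triaging a dev_disp trace like this one, the churn per slot can be counted mechanically; this is a hedged sketch (the `wp_events` helper and regexes are my own, written against the exact line shapes shown in this trace):

```python
import re
from collections import Counter

# Sample lines in the dev_disp shapes seen above (hypothetical excerpt).
LOG = """\
DpWpDynCreate: created new work process W12-13928
*** ERROR => DpHdlDeadWp: W12 (pid 13928) died (severity=0, status=65280) [dpxxwp.c 1463]
DpHdlDeadWp: W17 (pid=15441) terminated automatically
DpWpDynCreate: created new work process W12-23319
"""

# "created new work process W12-13928" -> slot W12, pid 13928
CREATE = re.compile(r"DpWpDynCreate: created new work process (W\d+)-(\d+)")
# Matches both "W12 (pid 13928) died" and "W17 (pid=15441) terminated"
DEAD = re.compile(r"DpHdlDeadWp: (W\d+) \(pid=?\s?(\d+)\)")

def wp_events(text):
    """Count create and death events per work-process slot."""
    created, died = Counter(), Counter()
    for line in text.splitlines():
        m = CREATE.search(line)
        if m:
            created[m.group(1)] += 1
            continue
        m = DEAD.search(line)
        if m:
            died[m.group(1)] += 1
    return created, died

created, died = wp_events(LOG)
print(created["W12"], died["W12"], died["W17"])
```

A slot whose create count keeps climbing while its deaths carry a nonzero status (here status=65280, i.e. exit code 255) is the one to chase in the corresponding dev_w* trace.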

Thu Sep 19 11:14:34:840 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 11:14:54:841 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Thu Sep 19 11:15:14:841 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W12 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W12-23319
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 23320

Thu Sep 19 11:15:34:842 2019


DpCheckSapcontrolProcess: sapcontrol with pid 23320 terminated
Thu Sep 19 11:17:34:844 2019
DpWpCheck: dyn W17, pid 15441 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=15441) terminated automatically

Thu Sep 19 11:18:09:921 2019


DpWpDynCreate: created new work process W18-28994

Thu Sep 19 11:18:22:424 2019


DpWpDynCreate: created new work process W19-28999

Thu Sep 19 11:18:57:063 2019


DpHdlDeadWp: W20 (pid=18072) terminated automatically

Thu Sep 19 11:23:11:692 2019


DpHdlDeadWp: W18 (pid=28994) terminated automatically

Thu Sep 19 11:23:23:192 2019


DpHdlDeadWp: W19 (pid=28999) terminated automatically

Thu Sep 19 11:23:25:378 2019


DpWpDynCreate: created new work process W16-30766

Thu Sep 19 11:28:34:864 2019


DpWpCheck: dyn W16, pid 30766 no longer needed, terminate now

Thu Sep 19 11:28:35:256 2019


DpHdlDeadWp: W16 (pid=30766) terminated automatically

Thu Sep 19 11:30:02:511 2019


DpWpDynCreate: created new work process W17-745

Thu Sep 19 11:33:04:611 2019


DpWpDynCreate: created new work process W20-1744

Thu Sep 19 11:35:03:184 2019


DpHdlDeadWp: W17 (pid=745) terminated automatically

Thu Sep 19 11:38:10:150 2019


DpHdlDeadWp: W20 (pid=1744) terminated automatically

Thu Sep 19 11:38:47:433 2019


DpWpDynCreate: created new work process W18-3842

Thu Sep 19 11:39:04:112 2019


DpWpDynCreate: created new work process W19-3944

Thu Sep 19 11:43:54:888 2019


DpWpCheck: dyn W18, pid 3842 no longer needed, terminate now

Thu Sep 19 11:43:55:281 2019


DpHdlDeadWp: W18 (pid=3842) terminated automatically

Thu Sep 19 11:44:14:888 2019


DpWpCheck: dyn W19, pid 3944 no longer needed, terminate now

Thu Sep 19 11:44:15:433 2019


DpHdlDeadWp: W19 (pid=3944) terminated automatically
Thu Sep 19 12:14:20:121 2019
DpWpDynCreate: created new work process W16-11503

Thu Sep 19 12:17:04:630 2019


DpWpDynCreate: created new work process W17-12416
DpWpDynCreate: created new work process W20-12417

Thu Sep 19 12:19:15:634 2019


DpWpDynCreate: created new work process W18-12985

Thu Sep 19 12:19:22:044 2019


DpHdlDeadWp: W16 (pid=11503) terminated automatically

Thu Sep 19 12:22:14:951 2019


DpWpCheck: dyn W17, pid 12416 no longer needed, terminate now

Thu Sep 19 12:22:15:731 2019


DpHdlDeadWp: W17 (pid=12416) terminated automatically

Thu Sep 19 12:22:24:457 2019


DpHdlDeadWp: W20 (pid=12417) terminated automatically

Thu Sep 19 12:24:16:236 2019


DpHdlDeadWp: W18 (pid=12985) terminated automatically

Thu Sep 19 12:24:17:483 2019


DpWpDynCreate: created new work process W19-14899

Thu Sep 19 12:27:34:636 2019


DpHdlSoftCancel: cancel request for T44_U2291_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CANCEL_REQUEST)
DpHdlSoftCancel: cancel request for T98_U2582_M0 received from GATEWAY (reason=DP_SOFTCANCEL_ABORT_PROGRAM)

Thu Sep 19 12:29:03:248 2019


DpWpDynCreate: created new work process W16-16523

Thu Sep 19 12:29:03:395 2019


DpWpDynCreate: created new work process W17-16524

Thu Sep 19 12:29:18:611 2019


DpHdlDeadWp: W19 (pid=14899) terminated automatically

Thu Sep 19 12:34:04:773 2019


DpWpCheck: dyn W16, pid 16523 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=16524) terminated automatically

Thu Sep 19 12:34:04:892 2019


DpHdlDeadWp: W16 (pid=16523) terminated automatically

Thu Sep 19 12:39:15:642 2019


DpWpDynCreate: created new work process W20-19781

Thu Sep 19 12:44:19:238 2019


DpHdlDeadWp: W20 (pid=19781) terminated automatically

Thu Sep 19 12:49:16:058 2019


DpWpDynCreate: created new work process W18-23114
Thu Sep 19 12:54:19:222 2019
DpHdlDeadWp: W18 (pid=23114) terminated automatically

Thu Sep 19 12:57:06:800 2019


DpWpDynCreate: created new work process W19-26049

Thu Sep 19 13:00:41:263 2019


DpHdlDeadWp: W11 (pid=30853) terminated automatically
DpWpDynCreate: created new work process W11-27514

Thu Sep 19 13:02:13:389 2019


DpHdlDeadWp: W19 (pid=26049) terminated automatically

Thu Sep 19 13:07:04:276 2019


DpWpDynCreate: created new work process W17-18033

Thu Sep 19 13:12:05:388 2019


DpHdlDeadWp: W17 (pid=18033) terminated automatically

Thu Sep 19 13:18:35:221 2019


***LOG Q0I=> NiIRead: P=10.1.88.72:61520; L=10.54.36.29:3200: recv (104: Connection reset by peer) [/bas/749_REL/src/base/ni/nixxi.cpp 5420]
*** ERROR => NiIRead: SiRecv failed for hdl 50/sock 16 (SI_ECONN_BROKEN/104; I4; ST; P=10.1.88.72:61520; L=10.54.36.29:3200) [nixxi.cpp 5420]
***LOG Q04=> DpRTmPrepareReq, NiBufReceive (18494EXT_SCHAITAN153 SST-LAP-HP0055) [dpTerminal.c 736]
|GUI         |T153_U18494_M2  |001|EXT_SCHAITAN|SST-LAP-HP0055      |12:58:19|5  |SAPMSUU0                                |high|     |SU01      |
DpRTmPrepareReq: network error of client T153_U18494 in state DP_LOGGED_IN: NiBufReceive (-6: NIECONN_BROKEN)
NiHLGetHostName: got address 10.1.88.72 from operating system
NiIGetHostName: addr 10.1.88.72 = hostname 'hydn7381.niladv.org'
DpRTmPrepareReq: client address of T153_U18494 is 10.1.88.72(hydn7381.niladv.org)
DpHdlSoftCancel: cancel request for T153_U18494_M1 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlSoftCancel: cancel request for T153_U18494_M2 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Thu Sep 19 13:19:54:034 2019


DpWpDynCreate: created new work process W16-29254

Thu Sep 19 13:23:24:008 2019


DpWpDynCreate: created new work process W20-30512

Thu Sep 19 13:24:55:072 2019


DpWpCheck: dyn W16, pid 29254 no longer needed, terminate now

Thu Sep 19 13:24:55:748 2019


DpHdlDeadWp: W16 (pid=29254) terminated automatically

Thu Sep 19 13:28:35:300 2019


DpHdlDeadWp: W20 (pid=30512) terminated automatically

Thu Sep 19 13:32:04:309 2019


DpWpDynCreate: created new work process W18-1040

Thu Sep 19 13:32:05:292 2019


DpWpDynCreate: created new work process W19-1044

Thu Sep 19 13:37:06:023 2019


DpHdlDeadWp: W18 (pid=1040) terminated automatically
DpWpCheck: dyn W19, pid 1044 no longer needed, terminate now

Thu Sep 19 13:37:07:128 2019


DpHdlDeadWp: W19 (pid=1044) terminated automatically

Thu Sep 19 13:41:03:027 2019


DpWpDynCreate: created new work process W17-4159

Thu Sep 19 13:41:13:658 2019


DpWpDynCreate: created new work process W16-4225

Thu Sep 19 13:46:05:207 2019


DpHdlDeadWp: W17 (pid=4159) terminated automatically

Thu Sep 19 13:46:15:113 2019


DpWpCheck: dyn W16, pid 4225 no longer needed, terminate now

Thu Sep 19 13:46:15:248 2019


DpHdlDeadWp: W16 (pid=4225) terminated automatically

Thu Sep 19 13:46:19:849 2019


DpWpDynCreate: created new work process W20-5914

Thu Sep 19 13:47:08:175 2019


DpWpDynCreate: created new work process W18-6243

Thu Sep 19 13:51:21:414 2019


DpHdlDeadWp: W20 (pid=5914) terminated automatically

Thu Sep 19 13:52:13:654 2019


DpHdlDeadWp: W18 (pid=6243) terminated automatically

Thu Sep 19 13:56:07:021 2019


DpWpDynCreate: created new work process W19-9421

Thu Sep 19 13:58:05:283 2019


DpHdlDeadWp: W10 (pid=31385) terminated automatically
DpWpDynCreate: created new work process W10-10080

Thu Sep 19 14:00:34:969 2019


DpWpDynCreate: created new work process W17-11006
DpWpDynCreate: created new work process W16-11007

Thu Sep 19 14:00:35:304 2019


DpWpDynCreate: created new work process W20-11008

Thu Sep 19 14:01:15:141 2019


DpWpCheck: dyn W19, pid 9421 no longer needed, terminate now

Thu Sep 19 14:01:15:545 2019


DpHdlDeadWp: W19 (pid=9421) terminated automatically

Thu Sep 19 14:05:35:150 2019


DpWpCheck: dyn W16, pid 11007 no longer needed, terminate now
DpWpCheck: dyn W17, pid 11006 no longer needed, terminate now
Thu Sep 19 14:05:35:850 2019
DpHdlDeadWp: W16 (pid=11007) terminated automatically
DpHdlDeadWp: W17 (pid=11006) terminated automatically

Thu Sep 19 14:05:45:175 2019


DpHdlDeadWp: W20 (pid=11008) terminated automatically

Thu Sep 19 14:16:06:051 2019


DpWpDynCreate: created new work process W18-11597

Thu Sep 19 14:16:07:014 2019


DpWpDynCreate: created new work process W19-11600

Thu Sep 19 14:19:15:179 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T68_U18215 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T68_U18215_M0   |001|EXT_MKARIM  |SST-LAP-LEN0033     |11:15:19|0  |SAPLSDBACCMS                            |high|     |DBACOCKPIT|
DpHdlSoftCancel: cancel request for T68_U18215_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Thu Sep 19 14:21:07:441 2019


DpHdlDeadWp: W18 (pid=11597) terminated automatically

Thu Sep 19 14:21:08:954 2019


DpHdlDeadWp: W19 (pid=11600) terminated automatically

Thu Sep 19 14:23:30:291 2019


DpWpDynCreate: created new work process W16-14058

Thu Sep 19 14:28:31:233 2019


DpHdlDeadWp: W16 (pid=14058) terminated automatically

Thu Sep 19 14:28:36:243 2019


DpWpDynCreate: created new work process W17-15625

Thu Sep 19 14:30:07:038 2019


DpWpDynCreate: created new work process W20-16157

Thu Sep 19 14:33:43:372 2019


DpHdlDeadWp: W17 (pid=15625) terminated automatically

Thu Sep 19 14:35:02:900 2019


DpWpDynCreate: created new work process W18-17841

Thu Sep 19 14:35:09:476 2019


DpHdlDeadWp: W20 (pid=16157) terminated automatically

Thu Sep 19 14:40:03:760 2019


DpHdlDeadWp: W18 (pid=17841) terminated automatically

Thu Sep 19 14:40:04:112 2019


DpWpDynCreate: created new work process W19-19219

Thu Sep 19 14:45:15:224 2019


DpWpCheck: dyn W19, pid 19219 no longer needed, terminate now
Thu Sep 19 14:45:16:076 2019
DpHdlDeadWp: W19 (pid=19219) terminated automatically

Thu Sep 19 14:48:20:303 2019


DpWpDynCreate: created new work process W16-22079

Thu Sep 19 14:51:06:526 2019


DpWpDynCreate: created new work process W17-22941

Thu Sep 19 14:52:15:238 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T2_U15352 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T2_U15352_M0    |001|EXT_RKUDUMUL|SST-LAP-LEN0028     |10:47:19|5  |SAPMSYST                                |high|     |SESSION_MA|
DpHdlSoftCancel: cancel request for T2_U15352_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Thu Sep 19 14:53:02:961 2019


DpWpDynCreate: created new work process W20-23655

Thu Sep 19 14:53:21:636 2019


DpHdlDeadWp: W16 (pid=22079) terminated automatically

Thu Sep 19 14:56:15:244 2019


DpWpCheck: dyn W17, pid 22941 no longer needed, terminate now

Thu Sep 19 14:56:15:854 2019


DpHdlDeadWp: W17 (pid=22941) terminated automatically

Thu Sep 19 14:58:03:340 2019


DpHdlDeadWp: W20 (pid=23655) terminated automatically

Thu Sep 19 15:07:08:258 2019


DpWpDynCreate: created new work process W18-14972

Thu Sep 19 15:12:02:470 2019


DpWpDynCreate: created new work process W19-25743

Thu Sep 19 15:12:09:711 2019


DpHdlDeadWp: W18 (pid=14972) terminated automatically

Thu Sep 19 15:17:06:063 2019


DpHdlDeadWp: W19 (pid=25743) terminated automatically

Thu Sep 19 15:18:15:333 2019


DpWpDynCreate: created new work process W16-27742

Thu Sep 19 15:23:20:475 2019


DpHdlDeadWp: W16 (pid=27742) terminated automatically

Thu Sep 19 15:27:04:606 2019


DpWpDynCreate: created new work process W17-30600

Thu Sep 19 15:32:15:310 2019


DpWpCheck: dyn W17, pid 30600 no longer needed, terminate now

Thu Sep 19 15:32:15:826 2019


DpHdlDeadWp: W17 (pid=30600) terminated automatically
Thu Sep 19 15:38:22:679 2019
DpWpDynCreate: created new work process W20-1884

Thu Sep 19 15:43:35:327 2019


DpWpCheck: dyn W20, pid 1884 no longer needed, terminate now

Thu Sep 19 15:43:35:671 2019


DpHdlDeadWp: W20 (pid=1884) terminated automatically

Thu Sep 19 15:45:02:834 2019


DpWpDynCreate: created new work process W18-4205

Thu Sep 19 15:50:03:313 2019


DpHdlDeadWp: W18 (pid=4205) terminated automatically

Thu Sep 19 15:50:04:200 2019


DpWpDynCreate: created new work process W19-5941

Thu Sep 19 15:53:02:878 2019


DpWpDynCreate: created new work process W16-6929

Thu Sep 19 15:55:15:349 2019


DpWpCheck: dyn W19, pid 5941 no longer needed, terminate now

Thu Sep 19 15:55:15:528 2019


DpHdlDeadWp: W19 (pid=5941) terminated automatically

Thu Sep 19 15:58:04:530 2019


DpHdlDeadWp: W16 (pid=6929) terminated automatically

Thu Sep 19 16:03:24:987 2019


DpWpDynCreate: created new work process W17-18463

Thu Sep 19 16:05:02:401 2019


DpWpDynCreate: created new work process W20-24755

Thu Sep 19 16:05:03:185 2019


DpWpDynCreate: created new work process W18-24758

Thu Sep 19 16:08:36:202 2019


DpHdlDeadWp: W17 (pid=18463) terminated automatically

Thu Sep 19 16:10:05:461 2019


DpHdlDeadWp: W18 (pid=24758) terminated automatically
DpWpCheck: dyn W20, pid 24755 no longer needed, terminate now

Thu Sep 19 16:10:06:395 2019


DpHdlDeadWp: W20 (pid=24755) terminated automatically

Thu Sep 19 16:12:05:810 2019


DpWpDynCreate: created new work process W19-8959

Thu Sep 19 16:13:04:583 2019


DpWpDynCreate: created new work process W16-9146
DpWpDynCreate: created new work process W17-9147

Thu Sep 19 16:17:13:965 2019


DpHdlDeadWp: W19 (pid=8959) terminated automatically
Thu Sep 19 16:18:05:566 2019
DpWpCheck: dyn W16, pid 9146 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=9147) terminated automatically

Thu Sep 19 16:18:06:089 2019


DpHdlDeadWp: W16 (pid=9146) terminated automatically

Thu Sep 19 16:20:04:902 2019


DpWpDynCreate: created new work process W18-11747

Thu Sep 19 16:25:05:404 2019


DpHdlDeadWp: W18 (pid=11747) terminated automatically

Thu Sep 19 16:25:18:765 2019


DpWpDynCreate: created new work process W20-13263

Thu Sep 19 16:27:14:038 2019


DpWpDynCreate: created new work process W19-14054

Thu Sep 19 16:30:35:405 2019


DpWpCheck: dyn W20, pid 13263 no longer needed, terminate now

Thu Sep 19 16:30:35:694 2019


DpHdlDeadWp: W20 (pid=13263) terminated automatically

Thu Sep 19 16:32:15:707 2019


DpHdlDeadWp: W19 (pid=14054) terminated automatically

Thu Sep 19 16:32:58:427 2019


DpWpDynCreate: created new work process W17-15844

Thu Sep 19 16:38:03:527 2019


DpHdlDeadWp: W17 (pid=15844) terminated automatically
DpWpDynCreate: created new work process W16-17477

Thu Sep 19 16:43:04:887 2019


DpHdlDeadWp: W16 (pid=17477) terminated automatically

Thu Sep 19 16:56:08:772 2019


DpWpDynCreate: created new work process W18-23195

Thu Sep 19 17:01:15:762 2019


DpWpCheck: dyn W18, pid 23195 no longer needed, terminate now

Thu Sep 19 17:01:17:127 2019


DpHdlDeadWp: W18 (pid=23195) terminated automatically

Thu Sep 19 17:14:15:392 2019


DpWpDynCreate: created new work process W20-24875

Thu Sep 19 17:19:17:761 2019


DpHdlDeadWp: W20 (pid=24875) terminated automatically

Thu Sep 19 17:21:08:287 2019


DpHdlDeadWp: W9 (pid=28886) terminated automatically
DpWpDynCreate: created new work process W9-27205

Thu Sep 19 17:22:54:517 2019


DpHdlDeadWp: W13 (pid=16274) terminated automatically
DpWpDynCreate: created new work process W13-27670

Thu Sep 19 17:31:48:625 2019


DpWpDynCreate: created new work process W19-30644

Thu Sep 19 17:36:50:187 2019


DpHdlDeadWp: W19 (pid=30644) terminated automatically

Thu Sep 19 17:37:05:210 2019


DpWpDynCreate: created new work process W17-32393

Thu Sep 19 17:40:18:641 2019


DpWpDynCreate: created new work process W16-939

Thu Sep 19 17:42:07:651 2019


DpHdlDeadWp: W17 (pid=32393) terminated automatically

Thu Sep 19 17:45:21:950 2019


DpHdlDeadWp: W16 (pid=939) terminated automatically

Thu Sep 19 17:49:05:858 2019


DpWpDynCreate: created new work process W18-3927

Thu Sep 19 17:52:08:185 2019


DpWpDynCreate: created new work process W20-4890

Thu Sep 19 17:54:15:844 2019


DpWpCheck: dyn W18, pid 3927 no longer needed, terminate now

Thu Sep 19 17:54:16:303 2019


DpHdlDeadWp: W18 (pid=3927) terminated automatically

Thu Sep 19 17:55:21:741 2019


DpWpDynCreate: created new work process W19-6013

Thu Sep 19 17:57:15:850 2019


DpWpCheck: dyn W20, pid 4890 no longer needed, terminate now

Thu Sep 19 17:57:16:449 2019


DpHdlDeadWp: W20 (pid=4890) terminated automatically

Thu Sep 19 18:00:22:618 2019


DpHdlDeadWp: W19 (pid=6013) terminated automatically

Thu Sep 19 18:06:08:642 2019


DpWpDynCreate: created new work process W17-26223

Thu Sep 19 18:07:23:510 2019


DpWpDynCreate: created new work process W16-31031

Thu Sep 19 18:11:10:156 2019


DpHdlDeadWp: W17 (pid=26223) terminated automatically

Thu Sep 19 18:12:35:874 2019


DpWpCheck: dyn W16, pid 31031 no longer needed, terminate now

Thu Sep 19 18:12:36:223 2019


DpHdlDeadWp: W16 (pid=31031) terminated automatically
Thu Sep 19 18:13:05:913 2019
DpWpDynCreate: created new work process W18-7682

Thu Sep 19 18:18:05:449 2019


DpWpDynCreate: created new work process W20-9364

Thu Sep 19 18:18:06:862 2019


DpHdlDeadWp: W18 (pid=7682) terminated automatically

Thu Sep 19 18:18:17:279 2019


DpWpDynCreate: created new work process W19-9436

Thu Sep 19 18:23:08:014 2019


DpHdlDeadWp: W20 (pid=9364) terminated automatically

Thu Sep 19 18:23:35:891 2019


DpWpCheck: dyn W19, pid 9436 no longer needed, terminate now

Thu Sep 19 18:23:36:225 2019


DpHdlDeadWp: W19 (pid=9436) terminated automatically

Thu Sep 19 18:25:25:737 2019


DpWpDynCreate: created new work process W17-11768

Thu Sep 19 18:27:05:527 2019


DpWpDynCreate: created new work process W16-12414

Thu Sep 19 18:30:35:902 2019


DpWpCheck: dyn W17, pid 11768 no longer needed, terminate now

Thu Sep 19 18:30:36:907 2019


DpHdlDeadWp: W17 (pid=11768) terminated automatically

Thu Sep 19 18:32:15:905 2019


DpWpCheck: dyn W16, pid 12414 no longer needed, terminate now

Thu Sep 19 18:32:16:975 2019


DpHdlDeadWp: W16 (pid=12414) terminated automatically

Thu Sep 19 18:34:29:410 2019


*** ERROR => NiIRead: invalid data (0x300002f/0x8800;mode=0;hdl 52;peer=92.63.194.50:1134;local=3200) [nixxi.cpp 5226]

Thu Sep 19 18:43:22:579 2019


DpWpDynCreate: created new work process W18-17646

Thu Sep 19 18:48:24:654 2019


DpHdlDeadWp: W18 (pid=17646) terminated automatically

Thu Sep 19 18:58:26:315 2019


DpWpDynCreate: created new work process W20-22466

Thu Sep 19 19:03:28:486 2019


DpHdlDeadWp: W20 (pid=22466) terminated automatically

Thu Sep 19 19:10:35:971 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T122_U10965 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI         |T122_U10965_M0  |001|EXT_MKARIM  |SST-LAP-HP0002      |17:48:32|7  |SAPLSDBACCMS                            |high|     |DBACOCKPIT|
DpHdlSoftCancel: cancel request for T122_U10965_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Thu Sep 19 19:13:12:727 2019


DpWpDynCreate: created new work process W19-22190

Thu Sep 19 19:13:12:993 2019


DpWpDynCreate: created new work process W17-22199

Thu Sep 19 19:16:02:428 2019


DpWpDynCreate: created new work process W16-23959

Thu Sep 19 19:18:15:727 2019


DpWpCheck: dyn W17, pid 22199 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=22190) terminated automatically
DpHdlDeadWp: W17 (pid=22199) terminated automatically

Thu Sep 19 19:21:10:632 2019


DpHdlDeadWp: W16 (pid=23959) terminated automatically

Thu Sep 19 19:23:16:627 2019


DpWpDynCreate: created new work process W18-26706

Thu Sep 19 19:25:21:043 2019


DpWpDynCreate: created new work process W20-27359

Thu Sep 19 19:28:17:300 2019


DpHdlDeadWp: W18 (pid=26706) terminated automatically

Thu Sep 19 19:30:36:003 2019


DpWpCheck: dyn W20, pid 27359 no longer needed, terminate now

Thu Sep 19 19:30:37:089 2019


DpHdlDeadWp: W20 (pid=27359) terminated automatically

Thu Sep 19 19:33:07:344 2019


DpWpDynCreate: created new work process W19-29920

Thu Sep 19 19:36:10:545 2019


DpWpDynCreate: created new work process W17-30817

Thu Sep 19 19:38:10:249 2019


DpHdlDeadWp: W19 (pid=29920) terminated automatically

Thu Sep 19 19:41:16:025 2019


DpWpCheck: dyn W17, pid 30817 no longer needed, terminate now

Thu Sep 19 19:41:16:406 2019


DpHdlDeadWp: W17 (pid=30817) terminated automatically

Thu Sep 19 19:44:20:487 2019


DpWpDynCreate: created new work process W16-981

Thu Sep 19 19:47:08:406 2019


DpWpDynCreate: created new work process W18-1936
Thu Sep 19 19:49:36:041 2019
DpWpCheck: dyn W16, pid 981 no longer needed, terminate now

Thu Sep 19 19:49:36:903 2019


DpHdlDeadWp: W16 (pid=981) terminated automatically

Thu Sep 19 19:50:06:346 2019


DpWpDynCreate: created new work process W20-3164

Thu Sep 19 19:52:16:045 2019


DpWpCheck: dyn W18, pid 1936 no longer needed, terminate now

Thu Sep 19 19:52:17:048 2019


DpHdlDeadWp: W18 (pid=1936) terminated automatically

Thu Sep 19 19:53:05:705 2019


DpWpDynCreate: created new work process W19-4147

Thu Sep 19 19:55:09:516 2019


DpHdlDeadWp: W20 (pid=3164) terminated automatically

Thu Sep 19 19:58:05:227 2019


DpHdlDeadWp: W9 (pid=27205) terminated automatically
DpWpDynCreate: created new work process W9-5658

Thu Sep 19 19:58:16:053 2019


DpWpCheck: dyn W19, pid 4147 no longer needed, terminate now

Thu Sep 19 19:58:16:666 2019


DpHdlDeadWp: W19 (pid=4147) terminated automatically

Thu Sep 19 19:58:44:696 2019


DpWpDynCreate: created new work process W17-6012

Thu Sep 19 20:02:21:921 2019


DpWpDynCreate: created new work process W16-10293

Thu Sep 19 20:03:56:069 2019


DpWpCheck: dyn W17, pid 6012 no longer needed, terminate now

Thu Sep 19 20:03:56:710 2019


DpHdlDeadWp: W17 (pid=6012) terminated automatically

Thu Sep 19 20:04:08:671 2019


DpWpDynCreate: created new work process W18-14869

Thu Sep 19 20:06:32:349 2019


DpWpDynCreate: created new work process W20-21954

Thu Sep 19 20:07:28:006 2019


DpHdlDeadWp: W16 (pid=10293) terminated automatically

Thu Sep 19 20:09:16:077 2019


DpWpCheck: dyn W18, pid 14869 no longer needed, terminate now

Thu Sep 19 20:09:17:772 2019


DpHdlDeadWp: W18 (pid=14869) terminated automatically

Thu Sep 19 20:11:36:083 2019


DpWpCheck: dyn W20, pid 21954 no longer needed, terminate now

Thu Sep 19 20:11:36:315 2019


DpHdlDeadWp: W20 (pid=21954) terminated automatically

Thu Sep 19 20:18:08:121 2019


DpWpDynCreate: created new work process W19-8550

Thu Sep 19 20:20:54:262 2019


DpWpDynCreate: created new work process W17-9385

Thu Sep 19 20:23:16:104 2019


DpWpCheck: dyn W19, pid 8550 no longer needed, terminate now

Thu Sep 19 20:23:16:349 2019


DpHdlDeadWp: W19 (pid=8550) terminated automatically

Thu Sep 19 20:25:56:108 2019


DpWpCheck: dyn W17, pid 9385 no longer needed, terminate now

Thu Sep 19 20:25:56:476 2019


DpHdlDeadWp: W17 (pid=9385) terminated automatically

Thu Sep 19 20:27:24:149 2019


DpWpDynCreate: created new work process W16-11519

Thu Sep 19 20:29:05:461 2019


DpWpDynCreate: created new work process W18-12055

Thu Sep 19 20:32:26:150 2019


DpHdlDeadWp: W16 (pid=11519) terminated automatically

Thu Sep 19 20:34:16:126 2019


DpWpCheck: dyn W18, pid 12055 no longer needed, terminate now
*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T10_U3348 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T10_U3348_M0 |001|SOLMAN_ADMIN|SST-LAP-HP0055 |20:13:10|6 |RSENQRR2 |high| | |SM12 |
DpHdlSoftCancel: cancel request for T10_U3348_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlDeadWp: W18 (pid=12055) terminated automatically

Thu Sep 19 20:36:07:414 2019


DpWpDynCreate: created new work process W20-14355

Thu Sep 19 20:36:14:809 2019


DpWpDynCreate: created new work process W19-14360

Thu Sep 19 20:41:15:582 2019


DpHdlDeadWp: W19 (pid=14360) terminated automatically
DpWpCheck: dyn W20, pid 14355 no longer needed, terminate now

Thu Sep 19 20:41:16:068 2019


DpHdlDeadWp: W20 (pid=14355) terminated automatically

Thu Sep 19 20:42:04:514 2019


DpWpDynCreate: created new work process W17-16098
Thu Sep 19 20:47:16:147 2019
DpWpCheck: dyn W17, pid 16098 no longer needed, terminate now

Thu Sep 19 20:47:16:923 2019


DpHdlDeadWp: W17 (pid=16098) terminated automatically

Thu Sep 19 20:51:14:962 2019


DpWpDynCreate: created new work process W16-19000

Thu Sep 19 20:56:08:782 2019


DpHdlDeadWp: W13 (pid=27670) terminated automatically
DpWpDynCreate: created new work process W13-20790

Thu Sep 19 20:56:33:703 2019


DpHdlDeadWp: W16 (pid=19000) terminated automatically

Thu Sep 19 20:59:04:956 2019


DpWpDynCreate: created new work process W18-21693

Thu Sep 19 21:04:16:188 2019


DpWpCheck: dyn W18, pid 21693 no longer needed, terminate now

Thu Sep 19 21:04:17:087 2019


DpHdlDeadWp: W18 (pid=21693) terminated automatically

Thu Sep 19 21:07:12:995 2019


DpWpDynCreate: created new work process W19-11136

Thu Sep 19 21:07:14:175 2019


DpWpDynCreate: created new work process W20-11159

Thu Sep 19 21:12:14:542 2019


DpHdlDeadWp: W19 (pid=11136) terminated automatically
DpWpDynCreate: created new work process W19-21447

Thu Sep 19 21:12:15:232 2019


DpHdlDeadWp: W20 (pid=11159) terminated automatically

Thu Sep 19 21:17:15:544 2019


DpHdlDeadWp: W19 (pid=21447) terminated automatically

Thu Sep 19 21:17:17:536 2019


DpWpDynCreate: created new work process W17-23238

Thu Sep 19 21:22:18:960 2019


DpHdlDeadWp: W17 (pid=23238) terminated automatically

Thu Sep 19 21:29:17:786 2019


DpWpDynCreate: created new work process W16-27521

Thu Sep 19 21:30:07:453 2019


DpWpDynCreate: created new work process W18-27828

Thu Sep 19 21:34:36:236 2019


DpWpCheck: dyn W16, pid 27521 no longer needed, terminate now

Thu Sep 19 21:34:36:535 2019


DpHdlDeadWp: W16 (pid=27521) terminated automatically
Thu Sep 19 21:35:16:238 2019
DpWpCheck: dyn W18, pid 27828 no longer needed, terminate now

Thu Sep 19 21:35:16:662 2019


DpHdlDeadWp: W18 (pid=27828) terminated automatically

Thu Sep 19 21:39:06:709 2019


DpWpDynCreate: created new work process W20-30640

Thu Sep 19 21:44:08:576 2019


DpHdlDeadWp: W20 (pid=30640) terminated automatically

Thu Sep 19 21:47:21:094 2019


DpWpDynCreate: created new work process W19-1019

Thu Sep 19 21:52:11:003 2019


DpWpDynCreate: created new work process W17-2869

Thu Sep 19 21:52:11:967 2019


DpWpDynCreate: created new work process W16-2872

Thu Sep 19 21:52:36:395 2019


DpHdlDeadWp: W19 (pid=1019) terminated automatically

Thu Sep 19 21:57:12:380 2019


DpWpCheck: dyn W16, pid 2872 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=2869) terminated automatically

Thu Sep 19 21:57:12:602 2019


DpHdlDeadWp: W16 (pid=2872) terminated automatically

Thu Sep 19 22:02:23:228 2019


DpWpDynCreate: created new work process W18-10213

Thu Sep 19 22:03:26:309 2019


DpWpDynCreate: created new work process W20-14267

Thu Sep 19 22:07:32:418 2019


DpHdlDeadWp: W18 (pid=10213) terminated automatically

Thu Sep 19 22:08:36:423 2019


DpWpCheck: dyn W20, pid 14267 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=14267) terminated automatically

Thu Sep 19 22:18:04:837 2019


DpWpDynCreate: created new work process W19-7091

Thu Sep 19 22:23:08:175 2019


DpHdlDeadWp: W19 (pid=7091) terminated automatically

Thu Sep 19 22:27:15:099 2019


DpWpDynCreate: created new work process W17-10314

Thu Sep 19 22:32:16:464 2019


DpWpCheck: dyn W17, pid 10314 no longer needed, terminate now

Thu Sep 19 22:32:16:691 2019


DpHdlDeadWp: W17 (pid=10314) terminated automatically
Thu Sep 19 22:32:24:686 2019
DpWpDynCreate: created new work process W16-12086

Thu Sep 19 22:34:11:353 2019


DpWpDynCreate: created new work process W18-12662

Thu Sep 19 22:37:36:472 2019


DpWpCheck: dyn W16, pid 12086 no longer needed, terminate now

Thu Sep 19 22:37:36:978 2019


DpHdlDeadWp: W16 (pid=12086) terminated automatically

Thu Sep 19 22:39:16:474 2019


DpWpCheck: dyn W18, pid 12662 no longer needed, terminate now

Thu Sep 19 22:39:17:064 2019


DpHdlDeadWp: W18 (pid=12662) terminated automatically

Thu Sep 19 22:42:06:860 2019


DpWpDynCreate: created new work process W20-15162

Thu Sep 19 22:47:08:694 2019


DpHdlDeadWp: W20 (pid=15162) terminated automatically

Thu Sep 19 22:48:02:582 2019


DpWpDynCreate: created new work process W19-17017

Thu Sep 19 22:48:04:992 2019


DpWpDynCreate: created new work process W17-17023

Thu Sep 19 22:53:05:052 2019


DpWpCheck: dyn W17, pid 17023 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=17017) terminated automatically

Thu Sep 19 22:53:06:124 2019


DpHdlDeadWp: W17 (pid=17023) terminated automatically

Thu Sep 19 22:56:13:583 2019


DpWpDynCreate: created new work process W16-19396

Thu Sep 19 23:01:16:513 2019


DpWpCheck: dyn W16, pid 19396 no longer needed, terminate now

Thu Sep 19 23:01:17:626 2019


DpHdlDeadWp: W16 (pid=19396) terminated automatically

Thu Sep 19 23:04:08:165 2019


DpWpDynCreate: created new work process W18-1130

Thu Sep 19 23:09:16:526 2019


DpWpCheck: dyn W18, pid 1130 no longer needed, terminate now

Thu Sep 19 23:09:17:070 2019


DpHdlDeadWp: W18 (pid=1130) terminated automatically

Thu Sep 19 23:13:28:719 2019


DpWpDynCreate: created new work process W20-20635

Thu Sep 19 23:18:36:544 2019


DpWpCheck: dyn W20, pid 20635 no longer needed, terminate now

Thu Sep 19 23:18:37:628 2019


DpHdlDeadWp: W20 (pid=20635) terminated automatically

Thu Sep 19 23:18:43:146 2019


DpWpDynCreate: created new work process W19-22727

Thu Sep 19 23:20:02:660 2019


DpWpDynCreate: created new work process W17-23183

Thu Sep 19 23:23:07:212 2019


DpWpDynCreate: created new work process W16-24302

Thu Sep 19 23:23:56:554 2019


DpWpCheck: dyn W19, pid 22727 no longer needed, terminate now

Thu Sep 19 23:23:59:692 2019


DpHdlDeadWp: W19 (pid=22727) terminated automatically

Thu Sep 19 23:25:03:741 2019


DpHdlDeadWp: W17 (pid=23183) terminated automatically

Thu Sep 19 23:28:09:091 2019


DpHdlDeadWp: W16 (pid=24302) terminated automatically

Thu Sep 19 23:31:28:702 2019


DpWpDynCreate: created new work process W18-27070

Thu Sep 19 23:33:06:571 2019


DpWpDynCreate: created new work process W20-27551

Thu Sep 19 23:33:49:079 2019


DpWpDynCreate: created new work process W19-27876

Thu Sep 19 23:36:36:573 2019


DpWpCheck: dyn W18, pid 27070 no longer needed, terminate now

Thu Sep 19 23:36:37:273 2019


DpHdlDeadWp: W18 (pid=27070) terminated automatically

Thu Sep 19 23:38:09:356 2019


DpHdlDeadWp: W20 (pid=27551) terminated automatically

Thu Sep 19 23:38:56:575 2019


DpWpCheck: dyn W19, pid 27876 no longer needed, terminate now

Thu Sep 19 23:38:57:464 2019


DpHdlDeadWp: W19 (pid=27876) terminated automatically

Thu Sep 19 23:39:02:586 2019


DpWpDynCreate: created new work process W17-29546

Thu Sep 19 23:39:02:740 2019


DpWpDynCreate: created new work process W16-29547

Thu Sep 19 23:44:03:808 2019


DpHdlDeadWp: W16 (pid=29547) terminated automatically
DpWpCheck: dyn W17, pid 29546 no longer needed, terminate now
Thu Sep 19 23:44:04:919 2019
DpHdlDeadWp: W17 (pid=29546) terminated automatically

Thu Sep 19 23:46:06:815 2019


DpWpDynCreate: created new work process W18-31622

Thu Sep 19 23:51:07:625 2019


DpHdlDeadWp: W18 (pid=31622) terminated automatically

Fri Sep 20 00:02:21:677 2019


DpWpDynCreate: created new work process W20-4655

Fri Sep 20 00:03:22:308 2019


DpWpDynCreate: created new work process W19-7910

Fri Sep 20 00:07:35:449 2019


DpHdlDeadWp: W20 (pid=4655) terminated automatically

Fri Sep 20 00:08:36:623 2019


DpWpCheck: dyn W19, pid 7910 no longer needed, terminate now

Fri Sep 20 00:08:38:642 2019


DpHdlDeadWp: W19 (pid=7910) terminated automatically

Fri Sep 20 00:19:14:993 2019


DpWpDynCreate: created new work process W16-5785

Fri Sep 20 00:20:04:330 2019


DpWpDynCreate: created new work process W17-6074

Fri Sep 20 00:24:15:960 2019


DpHdlDeadWp: W16 (pid=5785) terminated automatically

Fri Sep 20 00:25:05:232 2019


DpHdlDeadWp: W17 (pid=6074) terminated automatically

Fri Sep 20 00:27:26:234 2019


DpWpDynCreate: created new work process W18-11968

Fri Sep 20 00:32:36:660 2019


DpWpCheck: dyn W18, pid 11968 no longer needed, terminate now

Fri Sep 20 00:32:37:776 2019


DpHdlDeadWp: W18 (pid=11968) terminated automatically

Fri Sep 20 00:39:23:911 2019


DpWpDynCreate: created new work process W20-2679

Fri Sep 20 00:44:27:811 2019


DpWpDynCreate: created new work process W19-4598

Fri Sep 20 00:44:36:690 2019


DpWpCheck: dyn W20, pid 2679 no longer needed, terminate now

Fri Sep 20 00:44:37:550 2019


DpHdlDeadWp: W20 (pid=2679) terminated automatically

Fri Sep 20 00:48:30:527 2019


DpWpDynCreate: created new work process W16-6158

Fri Sep 20 00:49:36:697 2019


DpWpCheck: dyn W19, pid 4598 no longer needed, terminate now

Fri Sep 20 00:49:37:872 2019


DpHdlDeadWp: W19 (pid=4598) terminated automatically

Fri Sep 20 00:53:36:704 2019


DpWpCheck: dyn W16, pid 6158 no longer needed, terminate now

Fri Sep 20 00:53:37:129 2019


DpHdlDeadWp: W16 (pid=6158) terminated automatically

Fri Sep 20 00:55:14:102 2019


DpWpDynCreate: created new work process W17-8798

Fri Sep 20 01:00:15:543 2019


DpHdlDeadWp: W17 (pid=8798) terminated automatically

Fri Sep 20 01:01:07:198 2019


DpWpDynCreate: created new work process W18-22572

Fri Sep 20 01:03:29:752 2019


DpWpDynCreate: created new work process W20-29093

Fri Sep 20 01:06:09:404 2019


DpHdlDeadWp: W18 (pid=22572) terminated automatically

Fri Sep 20 01:08:08:590 2019


DpHdlDeadWp: W13 (pid=20790) terminated automatically
DpWpDynCreate: created new work process W13-9600

Fri Sep 20 01:08:36:729 2019


DpWpCheck: dyn W20, pid 29093 no longer needed, terminate now

Fri Sep 20 01:08:36:866 2019


DpHdlDeadWp: W20 (pid=29093) terminated automatically

Fri Sep 20 01:16:21:324 2019


DpWpDynCreate: created new work process W19-23351

Fri Sep 20 01:20:02:804 2019


DpWpDynCreate: created new work process W16-24796

Fri Sep 20 01:21:24:532 2019


DpHdlDeadWp: W19 (pid=23351) terminated automatically

Fri Sep 20 01:25:16:756 2019


DpWpCheck: dyn W16, pid 24796 no longer needed, terminate now

Fri Sep 20 01:25:16:975 2019


DpHdlDeadWp: W16 (pid=24796) terminated automatically

Fri Sep 20 01:26:21:618 2019


DpWpDynCreate: created new work process W17-27062

Fri Sep 20 01:31:23:582 2019


DpHdlDeadWp: W17 (pid=27062) terminated automatically
Fri Sep 20 01:33:33:781 2019
DpWpDynCreate: created new work process W18-29543

Fri Sep 20 01:35:07:837 2019


DpWpDynCreate: created new work process W20-30011

Fri Sep 20 01:38:36:777 2019


DpWpCheck: dyn W18, pid 29543 no longer needed, terminate now

Fri Sep 20 01:38:37:317 2019


DpHdlDeadWp: W18 (pid=29543) terminated automatically

Fri Sep 20 01:40:17:481 2019


DpHdlDeadWp: W20 (pid=30011) terminated automatically

Fri Sep 20 01:40:38:924 2019


DpHdlDeadWp: W11 (pid=27514) terminated automatically
DpWpDynCreate: created new work process W11-31686

Fri Sep 20 01:46:33:226 2019


DpWpDynCreate: created new work process W19-1329

Fri Sep 20 01:47:05:492 2019


DpWpDynCreate: created new work process W16-1544

Fri Sep 20 01:51:36:956 2019


DpHdlDeadWp: W19 (pid=1329) terminated automatically

Fri Sep 20 01:52:16:957 2019


DpWpCheck: dyn W16, pid 1544 no longer needed, terminate now

Fri Sep 20 01:52:18:022 2019


DpHdlDeadWp: W16 (pid=1544) terminated automatically

Fri Sep 20 01:54:09:271 2019


DpWpDynCreate: created new work process W17-4018

Fri Sep 20 01:59:16:968 2019


DpWpCheck: dyn W17, pid 4018 no longer needed, terminate now

Fri Sep 20 01:59:17:346 2019


DpHdlDeadWp: W17 (pid=4018) terminated automatically

Fri Sep 20 02:01:23:652 2019


*** ERROR => NiIRead: invalid data (0x300002f/0x8800;mode=0;hdl 52;peer=185.156.177.110:733;local=3200) [nixxi.cpp 5226]

Fri Sep 20 02:06:07:430 2019


DpWpDynCreate: created new work process W18-21674

Fri Sep 20 02:06:15:653 2019


DpWpDynCreate: created new work process W20-21704

Fri Sep 20 02:11:16:989 2019


DpWpCheck: dyn W18, pid 21674 no longer needed, terminate now
DpWpCheck: dyn W20, pid 21704 no longer needed, terminate now

Fri Sep 20 02:11:17:196 2019


DpHdlDeadWp: W18 (pid=21674) terminated automatically
DpHdlDeadWp: W20 (pid=21704) terminated automatically

Fri Sep 20 02:11:21:538 2019


DpWpDynCreate: created new work process W19-5039

Fri Sep 20 02:16:15:372 2019


DpWpDynCreate: created new work process W16-6744

Fri Sep 20 02:16:23:191 2019


DpHdlDeadWp: W19 (pid=5039) terminated automatically

Fri Sep 20 02:21:13:067 2019


DpWpDynCreate: created new work process W17-8656

Fri Sep 20 02:21:16:293 2019


DpHdlDeadWp: W16 (pid=6744) terminated automatically

Fri Sep 20 02:23:03:292 2019


DpWpDynCreate: created new work process W18-9282

Fri Sep 20 02:26:13:010 2019


DpWpDynCreate: created new work process W20-10194

Fri Sep 20 02:26:14:716 2019


DpHdlDeadWp: W17 (pid=8656) terminated automatically

Fri Sep 20 02:28:05:582 2019


DpHdlDeadWp: W18 (pid=9282) terminated automatically

Fri Sep 20 02:28:10:409 2019


DpWpDynCreate: created new work process W19-10805

Fri Sep 20 02:31:17:023 2019


DpWpCheck: dyn W20, pid 10194 no longer needed, terminate now

Fri Sep 20 02:31:17:214 2019


DpHdlDeadWp: W20 (pid=10194) terminated automatically

Fri Sep 20 02:33:17:027 2019


DpWpCheck: dyn W19, pid 10805 no longer needed, terminate now

Fri Sep 20 02:33:17:989 2019


DpHdlDeadWp: W19 (pid=10805) terminated automatically

Fri Sep 20 02:41:04:416 2019


DpWpDynCreate: created new work process W16-15187

Fri Sep 20 02:46:06:100 2019


DpHdlDeadWp: W16 (pid=15187) terminated automatically

Fri Sep 20 02:46:07:365 2019


DpWpDynCreate: created new work process W17-16632

Fri Sep 20 02:51:10:999 2019


DpHdlDeadWp: W17 (pid=16632) terminated automatically

Fri Sep 20 02:56:04:436 2019


DpWpDynCreate: created new work process W18-19756
Fri Sep 20 02:56:16:318 2019
DpWpDynCreate: created new work process W20-19877

Fri Sep 20 03:01:06:428 2019


DpHdlDeadWp: W18 (pid=19756) terminated automatically

Fri Sep 20 03:01:17:079 2019


DpWpCheck: dyn W20, pid 19877 no longer needed, terminate now

Fri Sep 20 03:01:17:367 2019


DpHdlDeadWp: W20 (pid=19877) terminated automatically

Fri Sep 20 03:06:18:076 2019


DpWpDynCreate: created new work process W19-5596

Fri Sep 20 03:11:19:340 2019


DpHdlDeadWp: W19 (pid=5596) terminated automatically

Fri Sep 20 03:13:03:506 2019


DpWpDynCreate: created new work process W16-21091

Fri Sep 20 03:18:04:519 2019


DpHdlDeadWp: W16 (pid=21091) terminated automatically

Fri Sep 20 03:20:07:461 2019


DpWpDynCreate: created new work process W17-23555

Fri Sep 20 03:21:20:712 2019


DpWpDynCreate: created new work process W18-23874

Fri Sep 20 03:25:14:754 2019


DpHdlDeadWp: W17 (pid=23555) terminated automatically

Fri Sep 20 03:26:22:766 2019


DpHdlDeadWp: W18 (pid=23874) terminated automatically

Fri Sep 20 03:27:10:097 2019


DpWpDynCreate: created new work process W20-26386

Fri Sep 20 03:27:14:541 2019


DpWpDynCreate: created new work process W19-26389

Fri Sep 20 03:32:12:392 2019


DpHdlDeadWp: W20 (pid=26386) terminated automatically

Fri Sep 20 03:32:15:341 2019


DpHdlDeadWp: W19 (pid=26389) terminated automatically

Fri Sep 20 03:32:50:896 2019


DpWpDynCreate: created new work process W16-28140

Fri Sep 20 03:33:01:501 2019


DpWpDynCreate: created new work process W17-28256

Fri Sep 20 03:33:29:033 2019


DpHdlSoftCancel: cancel request for T4_U1520_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CANCEL_REQUEST)
Fri Sep 20 03:37:10:064 2019
DpWpDynCreate: created new work process W18-29630

Fri Sep 20 03:37:57:152 2019


DpWpCheck: dyn W16, pid 28140 no longer needed, terminate now

Fri Sep 20 03:37:59:335 2019


DpHdlDeadWp: W16 (pid=28140) terminated automatically

Fri Sep 20 03:38:02:662 2019


DpHdlDeadWp: W17 (pid=28256) terminated automatically

Fri Sep 20 03:42:04:383 2019


DpWpDynCreate: created new work process W20-31034

Fri Sep 20 03:42:14:153 2019


DpHdlDeadWp: W18 (pid=29630) terminated automatically

Fri Sep 20 03:42:14:605 2019


DpWpDynCreate: created new work process W19-31041

Fri Sep 20 03:47:06:650 2019


DpHdlDeadWp: W20 (pid=31034) terminated automatically

Fri Sep 20 03:47:17:053 2019


DpWpDynCreate: created new work process W16-450

Fri Sep 20 03:47:17:172 2019


DpWpCheck: dyn W19, pid 31041 no longer needed, terminate now

Fri Sep 20 03:47:17:450 2019


DpHdlDeadWp: W19 (pid=31041) terminated automatically

Fri Sep 20 03:47:34:305 2019


DpWpDynCreate: created new work process W17-628

Fri Sep 20 03:52:18:556 2019


DpHdlDeadWp: W16 (pid=450) terminated automatically

Fri Sep 20 03:52:37:183 2019


DpWpCheck: dyn W17, pid 628 no longer needed, terminate now

Fri Sep 20 03:52:37:980 2019


DpHdlDeadWp: W17 (pid=628) terminated automatically

Fri Sep 20 03:57:04:571 2019


DpWpDynCreate: created new work process W18-3847

Fri Sep 20 04:02:07:184 2019


DpHdlDeadWp: W18 (pid=3847) terminated automatically

Fri Sep 20 04:02:07:748 2019


DpWpDynCreate: created new work process W20-8446

Fri Sep 20 04:04:11:816 2019


DpWpDynCreate: created new work process W19-17840

Fri Sep 20 04:07:17:217 2019


DpWpCheck: dyn W20, pid 8446 no longer needed, terminate now
Fri Sep 20 04:07:17:971 2019
DpHdlDeadWp: W20 (pid=8446) terminated automatically

Fri Sep 20 04:08:06:781 2019


DpWpDynCreate: created new work process W16-365

Fri Sep 20 04:08:16:782 2019


DpWpDynCreate: created new work process W17-704

Fri Sep 20 04:09:17:220 2019


DpWpCheck: dyn W19, pid 17840 no longer needed, terminate now

Fri Sep 20 04:09:18:067 2019


DpHdlDeadWp: W19 (pid=17840) terminated automatically

Fri Sep 20 04:13:07:842 2019


DpHdlDeadWp: W16 (pid=365) terminated automatically

Fri Sep 20 04:13:09:658 2019


DpWpDynCreate: created new work process W18-4704

Fri Sep 20 04:13:17:474 2019


DpHdlDeadWp: W17 (pid=704) terminated automatically

Fri Sep 20 04:18:10:701 2019


DpHdlDeadWp: W18 (pid=4704) terminated automatically

Fri Sep 20 04:21:05:397 2019


DpWpDynCreate: created new work process W20-7668

Fri Sep 20 04:23:14:279 2019


DpWpDynCreate: created new work process W19-8561

Fri Sep 20 04:23:23:216 2019


DpWpDynCreate: created new work process W16-8668

Fri Sep 20 04:26:07:784 2019


DpHdlDeadWp: W20 (pid=7668) terminated automatically

Fri Sep 20 04:28:15:521 2019


DpHdlDeadWp: W19 (pid=8561) terminated automatically

Fri Sep 20 04:28:26:164 2019


DpHdlDeadWp: W16 (pid=8668) terminated automatically

Fri Sep 20 04:32:12:027 2019


DpWpDynCreate: created new work process W17-11586

Fri Sep 20 04:33:28:319 2019


DpWpDynCreate: created new work process W18-11991

Fri Sep 20 04:34:37:119 2019


DpWpDynCreate: created new work process W20-12543

Fri Sep 20 04:37:16:836 2019


DpHdlDeadWp: W17 (pid=11586) terminated automatically

Fri Sep 20 04:38:37:274 2019


DpWpCheck: dyn W18, pid 11991 no longer needed, terminate now

Fri Sep 20 04:38:37:600 2019


DpHdlDeadWp: W18 (pid=11991) terminated automatically

Fri Sep 20 04:39:50:859 2019


DpHdlDeadWp: W20 (pid=12543) terminated automatically

Fri Sep 20 04:43:21:713 2019


DpWpDynCreate: created new work process W19-15314

Fri Sep 20 04:43:58:354 2019


DpHdlSoftCancel: cancel request for T126_U16992_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CANCEL_REQUEST)
DpHdlSoftCancel: cancel request for T134_U17143_M0 received from GATEWAY (reason=DP_SOFTCANCEL_ABORT_PROGRAM)

Fri Sep 20 04:44:04:839 2019


DpWpDynCreate: created new work process W16-15491

Fri Sep 20 04:47:29:525 2019


DpHdlDeadWp: W9 (pid=5658) terminated automatically
DpWpDynCreate: created new work process W9-16737

Fri Sep 20 04:48:37:290 2019


DpWpCheck: dyn W19, pid 15314 no longer needed, terminate now

Fri Sep 20 04:48:37:702 2019


DpHdlDeadWp: W19 (pid=15314) terminated automatically

Fri Sep 20 04:49:05:853 2019


DpHdlDeadWp: W16 (pid=15491) terminated automatically

Fri Sep 20 04:49:10:014 2019


DpWpDynCreate: created new work process W17-17211

Fri Sep 20 04:54:17:299 2019


DpWpCheck: dyn W17, pid 17211 no longer needed, terminate now

Fri Sep 20 04:54:18:176 2019


DpHdlDeadWp: W17 (pid=17211) terminated automatically

Fri Sep 20 04:56:05:863 2019


DpWpDynCreate: created new work process W18-19428

Fri Sep 20 05:00:02:980 2019


DpWpDynCreate: created new work process W20-20822

Fri Sep 20 05:01:09:395 2019


DpHdlDeadWp: W18 (pid=19428) terminated automatically

Fri Sep 20 05:05:04:188 2019


DpHdlDeadWp: W20 (pid=20822) terminated automatically

Fri Sep 20 05:17:01:966 2019


DpWpDynCreate: created new work process W19-21935

Fri Sep 20 05:18:17:346 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T153_U29246 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T153_U29246_M0 |001|SOLMAN_ADMIN|SST-LAP-HP0055 |04:36:52|2 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T153_U29246_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
DpHdlSoftCancel: cancel request for T153_U29246_M1 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)
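Each DpCheckTerminals NiCheck2 timeout like the one above names the session it disconnects, and the GUI detail line that follows carries the client, user, and frontend host. A sketch that maps each dropped session to those details — the column layout of the `|GUI |...` line is an assumption read off the excerpts in this trace:

```python
import re

# Patterns inferred from this trace's warning and session-detail lines (assumed layout).
WARN_RE = re.compile(r"NiCheck2\(rc=-23: NIENO_ANSWER\) failed for (T\d+_U\d+)")
SESS_RE = re.compile(r"\|GUI\s*\|(T\d+_U\d+_M\d+)\s*\|(\d+)\|([^|]+)\|([^|]+)\|")

def dropped_sessions(text):
    """Map each timed-out GUI session to its (client, user, frontend host)."""
    warned = set(WARN_RE.findall(text))
    out = {}
    for m in SESS_RE.finditer(text):
        sess, client, user, host = m.groups()
        if sess.rsplit("_", 1)[0] in warned:  # T153_U29246_M0 -> T153_U29246
            out[sess] = (client, user.strip(), host.strip())
    return out
```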

Fri Sep 20 05:22:03:056 2019


DpHdlDeadWp: W19 (pid=21935) terminated automatically

Fri Sep 20 05:26:02:749 2019


DpWpDynCreate: created new work process W16-25236

Fri Sep 20 05:27:02:886 2019


DpWpDynCreate: created new work process W17-25555

Fri Sep 20 05:27:03:158 2019


DpWpDynCreate: created new work process W18-25556

Fri Sep 20 05:31:17:364 2019


DpWpCheck: dyn W16, pid 25236 no longer needed, terminate now

Fri Sep 20 05:31:17:570 2019


DpHdlDeadWp: W16 (pid=25236) terminated automatically

Fri Sep 20 05:32:04:166 2019


DpHdlDeadWp: W17 (pid=25555) terminated automatically

Fri Sep 20 05:32:04:682 2019


DpHdlDeadWp: W18 (pid=25556) terminated automatically

Fri Sep 20 05:34:16:269 2019


DpWpDynCreate: created new work process W20-27950

Fri Sep 20 05:39:06:643 2019


DpWpDynCreate: created new work process W19-29619

Fri Sep 20 05:39:17:375 2019


DpWpCheck: dyn W20, pid 27950 no longer needed, terminate now

Fri Sep 20 05:39:17:489 2019


DpHdlDeadWp: W20 (pid=27950) terminated automatically

Fri Sep 20 05:44:17:383 2019


DpWpCheck: dyn W19, pid 29619 no longer needed, terminate now

Fri Sep 20 05:44:17:803 2019


DpHdlDeadWp: W19 (pid=29619) terminated automatically

Fri Sep 20 05:47:22:452 2019


DpWpDynCreate: created new work process W16-32202

Fri Sep 20 05:52:37:397 2019


DpWpCheck: dyn W16, pid 32202 no longer needed, terminate now

Fri Sep 20 05:52:38:275 2019


DpHdlDeadWp: W16 (pid=32202) terminated automatically
Fri Sep 20 05:53:15:149 2019
DpWpDynCreate: created new work process W17-1756

Fri Sep 20 05:58:17:407 2019


DpWpCheck: dyn W17, pid 1756 no longer needed, terminate now

Fri Sep 20 05:58:17:510 2019


DpHdlDeadWp: W17 (pid=1756) terminated automatically

Fri Sep 20 06:02:46:621 2019


DpWpDynCreate: created new work process W18-10583

Fri Sep 20 06:07:57:422 2019


DpWpCheck: dyn W18, pid 10583 no longer needed, terminate now

Fri Sep 20 06:07:58:095 2019


DpHdlDeadWp: W18 (pid=10583) terminated automatically

Fri Sep 20 06:08:12:531 2019


DpWpDynCreate: created new work process W20-30299

Fri Sep 20 06:13:14:141 2019


DpHdlDeadWp: W20 (pid=30299) terminated automatically

Fri Sep 20 06:13:17:352 2019


DpWpDynCreate: created new work process W19-4381

Fri Sep 20 06:18:19:312 2019


DpHdlDeadWp: W19 (pid=4381) terminated automatically
DpWpDynCreate: created new work process W19-6225

Fri Sep 20 06:18:36:737 2019


DpWpDynCreate: created new work process W16-6345

Fri Sep 20 06:23:20:532 2019


DpHdlDeadWp: W19 (pid=6225) terminated automatically

Fri Sep 20 06:23:37:824 2019


DpHdlDeadWp: W16 (pid=6345) terminated automatically

Fri Sep 20 06:27:06:969 2019


DpWpDynCreate: created new work process W17-9207

Fri Sep 20 06:27:07:079 2019


DpWpDynCreate: created new work process W18-9208

Fri Sep 20 06:32:07:774 2019


DpHdlDeadWp: W17 (pid=9207) terminated automatically

Fri Sep 20 06:32:08:634 2019


DpHdlDeadWp: W18 (pid=9208) terminated automatically

Fri Sep 20 06:33:20:434 2019


DpWpDynCreate: created new work process W20-10985

Fri Sep 20 06:38:37:477 2019


DpWpCheck: dyn W20, pid 10985 no longer needed, terminate now
Fri Sep 20 06:38:38:055 2019
DpHdlDeadWp: W20 (pid=10985) terminated automatically

Fri Sep 20 06:41:05:060 2019


DpWpDynCreate: created new work process W19-13504

Fri Sep 20 06:46:06:272 2019


DpHdlDeadWp: W19 (pid=13504) terminated automatically

Fri Sep 20 06:47:07:417 2019


DpWpDynCreate: created new work process W16-15488
Fri Sep 20 06:49:03:992 2019
DpWpDynCreate: created new work process W17-16182

Fri Sep 20 06:52:17:499 2019


DpWpCheck: dyn W16, pid 15488 no longer needed, terminate now

Fri Sep 20 06:52:17:960 2019


DpHdlDeadWp: W16 (pid=15488) terminated automatically

Fri Sep 20 06:54:04:310 2019


DpHdlDeadWp: W17 (pid=16182) terminated automatically

Fri Sep 20 06:55:44:814 2019


DpHdlDeadWp: W11 (pid=31686) terminated automatically
DpWpDynCreate: created new work process W11-18136

Fri Sep 20 06:56:49:115 2019


DpHdlDeadWp: W12 (pid=23319) terminated automatically
DpWpDynCreate: created new work process W12-18607

Fri Sep 20 06:56:52:908 2019


DpWpDynCreate: created new work process W18-18614

Fri Sep 20 07:01:57:511 2019


DpWpCheck: dyn W18, pid 18614 no longer needed, terminate now

Fri Sep 20 07:01:58:560 2019


DpHdlDeadWp: W18 (pid=18614) terminated automatically

Fri Sep 20 07:02:07:823 2019


DpWpDynCreate: created new work process W20-23118

Fri Sep 20 07:07:17:520 2019


DpWpCheck: dyn W20, pid 23118 no longer needed, terminate now

Fri Sep 20 07:07:17:919 2019


DpHdlDeadWp: W20 (pid=23118) terminated automatically

Fri Sep 20 07:08:03:027 2019


DpWpDynCreate: created new work process W19-12324

Fri Sep 20 07:13:12:339 2019


DpHdlDeadWp: W19 (pid=12324) terminated automatically

Fri Sep 20 07:15:09:404 2019


DpWpDynCreate: created new work process W16-20320

Fri Sep 20 07:20:10:763 2019


DpHdlDeadWp: W16 (pid=20320) terminated automatically

Fri Sep 20 07:20:11:285 2019


DpWpDynCreate: created new work process W17-22021

Fri Sep 20 07:20:12:184 2019


DpWpDynCreate: created new work process W18-22024

Fri Sep 20 07:25:12:790 2019


DpHdlDeadWp: W17 (pid=22021) terminated automatically

Fri Sep 20 07:25:14:951 2019


DpHdlDeadWp: W18 (pid=22024) terminated automatically

Fri Sep 20 07:35:14:444 2019


DpWpDynCreate: created new work process W20-27229

Fri Sep 20 07:35:17:204 2019


DpWpDynCreate: created new work process W19-27237

Fri Sep 20 07:40:17:582 2019


DpWpCheck: dyn W20, pid 27229 no longer needed, terminate now

Fri Sep 20 07:40:17:834 2019


DpHdlDeadWp: W20 (pid=27229) terminated automatically

Fri Sep 20 07:40:37:582 2019


DpWpCheck: dyn W19, pid 27237 no longer needed, terminate now

Fri Sep 20 07:40:37:942 2019


DpHdlDeadWp: W19 (pid=27237) terminated automatically

Fri Sep 20 07:46:23:885 2019


DpWpDynCreate: created new work process W16-30794

Fri Sep 20 07:48:07:486 2019


DpWpDynCreate: created new work process W17-31321

Fri Sep 20 07:51:37:605 2019


DpWpCheck: dyn W16, pid 30794 no longer needed, terminate now

Fri Sep 20 07:51:38:510 2019


DpHdlDeadWp: W16 (pid=30794) terminated automatically

Fri Sep 20 07:53:05:860 2019


DpWpDynCreate: created new work process W18-476

Fri Sep 20 07:53:17:610 2019


DpWpCheck: dyn W17, pid 31321 no longer needed, terminate now

Fri Sep 20 07:53:18:692 2019


DpHdlDeadWp: W17 (pid=31321) terminated automatically

Fri Sep 20 07:55:08:796 2019


DpHdlDeadWp: W12 (pid=18607) terminated automatically
DpWpDynCreate: created new work process W12-1258

Fri Sep 20 07:56:19:499 2019


DpWpDynCreate: created new work process W20-1571

Fri Sep 20 07:58:07:073 2019


DpHdlDeadWp: W18 (pid=476) terminated automatically

Fri Sep 20 08:01:23:748 2019


DpHdlDeadWp: W20 (pid=1571) terminated automatically

Fri Sep 20 08:08:03:156 2019


DpWpDynCreate: created new work process W19-29731

Fri Sep 20 08:08:03:441 2019


DpWpDynCreate: created new work process W16-29751

Fri Sep 20 08:11:12:622 2019


DpWpDynCreate: created new work process W17-1855

Fri Sep 20 08:13:04:494 2019


DpHdlDeadWp: W16 (pid=29751) terminated automatically
DpHdlDeadWp: W19 (pid=29731) terminated automatically

Fri Sep 20 08:16:17:650 2019


DpWpCheck: dyn W17, pid 1855 no longer needed, terminate now

Fri Sep 20 08:16:18:033 2019


DpHdlDeadWp: W17 (pid=1855) terminated automatically

Fri Sep 20 08:21:20:477 2019


DpWpDynCreate: created new work process W18-5653

Fri Sep 20 08:26:37:666 2019


DpWpCheck: dyn W18, pid 5653 no longer needed, terminate now

Fri Sep 20 08:26:37:997 2019


DpHdlDeadWp: W18 (pid=5653) terminated automatically

Fri Sep 20 08:27:09:452 2019


DpWpDynCreate: created new work process W20-8119

Fri Sep 20 08:27:27:493 2019


DpWpDynCreate: created new work process W16-8200

Fri Sep 20 08:27:27:926 2019


DpWpDynCreate: created new work process W19-8201

Fri Sep 20 08:30:57:109 2019


*** ERROR => NiIRead: invalid data (0x300002f/0x8800;mode=0;hdl 47;peer=78.128.113.18:3445;local=3200) [nixxi.cpp 5226]

Fri Sep 20 08:32:17:678 2019


DpWpCheck: dyn W20, pid 8119 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=8119) terminated automatically

Fri Sep 20 08:32:37:678 2019


DpWpCheck: dyn W16, pid 8200 no longer needed, terminate now
DpWpCheck: dyn W19, pid 8201 no longer needed, terminate now

Fri Sep 20 08:32:38:375 2019


DpHdlDeadWp: W16 (pid=8200) terminated automatically
DpHdlDeadWp: W19 (pid=8201) terminated automatically

Fri Sep 20 08:37:16:456 2019


DpWpDynCreate: created new work process W17-11420

Fri Sep 20 08:37:28:745 2019


DpWpDynCreate: created new work process W18-11426

Fri Sep 20 08:42:17:703 2019


DpWpCheck: dyn W17, pid 11420 no longer needed, terminate now

Fri Sep 20 08:42:17:890 2019


DpHdlDeadWp: W17 (pid=11420) terminated automatically

Fri Sep 20 08:42:37:545 2019


DpHdlDeadWp: W18 (pid=11426) terminated automatically

Fri Sep 20 08:45:02:953 2019


DpWpDynCreate: created new work process W20-13932
DpWpDynCreate: created new work process W16-13933

Fri Sep 20 08:50:03:535 2019


DpHdlDeadWp: W16 (pid=13933) terminated automatically
DpHdlDeadWp: W20 (pid=13932) terminated automatically
DpWpDynCreate: created new work process W20-15426

Fri Sep 20 08:55:03:022 2019


DpWpDynCreate: created new work process W19-17217
DpWpDynCreate: created new work process W17-17218

Fri Sep 20 08:55:04:567 2019


DpHdlDeadWp: W20 (pid=15426) terminated automatically

Fri Sep 20 09:00:04:472 2019


DpHdlDeadWp: W19 (pid=17217) terminated automatically

Fri Sep 20 09:00:06:516 2019


DpHdlDeadWp: W17 (pid=17218) terminated automatically

Fri Sep 20 09:00:25:858 2019


DpHdlDeadWp: W13 (pid=9600) terminated automatically
DpWpDynCreate: created new work process W13-18887

Fri Sep 20 09:02:06:612 2019


DpWpDynCreate: created new work process W18-22352

Fri Sep 20 09:07:07:837 2019


DpHdlDeadWp: W18 (pid=22352) terminated automatically

Fri Sep 20 09:09:02:978 2019


DpWpDynCreate: created new work process W16-17188
DpWpDynCreate: created new work process W20-17189

Fri Sep 20 09:09:03:076 2019


DpWpDynCreate: created new work process W19-17190

Fri Sep 20 09:14:03:069 2019


DpWpDynCreate: created new work process W17-18954

Fri Sep 20 09:14:03:953 2019


DpHdlDeadWp: W16 (pid=17188) terminated automatically
DpHdlDeadWp: W20 (pid=17189) terminated automatically

Fri Sep 20 09:14:06:495 2019


DpHdlDeadWp: W19 (pid=17190) terminated automatically

Fri Sep 20 09:16:02:970 2019


DpWpDynCreate: created new work process W18-19650

Fri Sep 20 09:19:05:003 2019


DpHdlDeadWp: W17 (pid=18954) terminated automatically

Fri Sep 20 09:21:03:904 2019


DpHdlDeadWp: W18 (pid=19650) terminated automatically

Fri Sep 20 09:22:17:664 2019


DpWpDynCreate: created new work process W16-21979

Fri Sep 20 09:26:05:927 2019


DpWpDynCreate: created new work process W20-23226

Fri Sep 20 09:27:20:585 2019


DpHdlDeadWp: W16 (pid=21979) terminated automatically

Fri Sep 20 09:31:07:352 2019


DpHdlDeadWp: W20 (pid=23226) terminated automatically

Fri Sep 20 09:33:17:794 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T8_U5677 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T8_U5677_M0 |001|SOLMAN_ADMIN|SST-LAP-LEN0028 |08:31:32|16 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T8_U5677_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Fri Sep 20 09:36:10:280 2019


DpWpDynCreate: created new work process W19-26683

Fri Sep 20 09:38:20:085 2019


DpWpDynCreate: created new work process W17-27329

Fri Sep 20 09:41:19:446 2019


DpHdlDeadWp: W19 (pid=26683) terminated automatically

Fri Sep 20 09:43:23:223 2019


DpHdlDeadWp: W17 (pid=27329) terminated automatically

Fri Sep 20 09:44:22:687 2019


DpWpDynCreate: created new work process W18-29402

Fri Sep 20 09:48:25:936 2019


DpWpDynCreate: created new work process W16-30683

Fri Sep 20 09:49:37:821 2019


DpWpCheck: dyn W18, pid 29402 no longer needed, terminate now

Fri Sep 20 09:49:38:282 2019


DpHdlDeadWp: W18 (pid=29402) terminated automatically

Fri Sep 20 09:53:37:828 2019


DpWpCheck: dyn W16, pid 30683 no longer needed, terminate now

Fri Sep 20 09:53:38:078 2019


DpHdlDeadWp: W16 (pid=30683) terminated automatically

Fri Sep 20 09:56:03:024 2019


DpWpDynCreate: created new work process W20-785

Fri Sep 20 10:01:17:840 2019


DpWpCheck: dyn W20, pid 785 no longer needed, terminate now

Fri Sep 20 10:01:18:410 2019


DpHdlDeadWp: W20 (pid=785) terminated automatically

Fri Sep 20 10:03:40:665 2019


DpWpDynCreate: created new work process W19-13510

Fri Sep 20 10:08:43:853 2019


DpHdlDeadWp: W19 (pid=13510) terminated automatically

Fri Sep 20 10:09:04:536 2019


DpWpDynCreate: created new work process W17-580

Fri Sep 20 10:14:05:899 2019


DpHdlDeadWp: W17 (pid=580) terminated automatically

Fri Sep 20 10:14:23:411 2019


DpWpDynCreate: created new work process W18-2812

Fri Sep 20 10:19:37:878 2019


DpWpCheck: dyn W18, pid 2812 no longer needed, terminate now

Fri Sep 20 10:19:38:612 2019


DpHdlDeadWp: W18 (pid=2812) terminated automatically

Fri Sep 20 10:20:30:134 2019


DpWpDynCreate: created new work process W16-4961

Fri Sep 20 10:21:39:791 2019


DpWpDynCreate: created new work process W20-5526

Fri Sep 20 10:25:35:115 2019


DpHdlDeadWp: W16 (pid=4961) terminated automatically

Fri Sep 20 10:26:57:890 2019


DpWpCheck: dyn W20, pid 5526 no longer needed, terminate now

Fri Sep 20 10:26:58:289 2019


DpHdlDeadWp: W20 (pid=5526) terminated automatically

Fri Sep 20 10:27:07:819 2019


DpWpDynCreate: created new work process W19-7468

Fri Sep 20 10:32:09:341 2019


DpHdlDeadWp: W19 (pid=7468) terminated automatically

Fri Sep 20 10:34:14:002 2019


DpWpDynCreate: created new work process W17-9770

Fri Sep 20 10:37:03:027 2019


DpWpDynCreate: created new work process W18-10842

Fri Sep 20 10:39:15:885 2019


DpHdlDeadWp: W17 (pid=9770) terminated automatically

Fri Sep 20 10:42:05:219 2019


DpHdlDeadWp: W18 (pid=10842) terminated automatically

Fri Sep 20 10:55:21:741 2019


DpWpDynCreate: created new work process W16-16463

Fri Sep 20 10:59:37:939 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T72_U6678 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T72_U6678_M0 |001|SOLMAN_ADMIN|SST-LAP-HP0055 |09:57:47|7 |SAPMSUU0 |high| | |SU01 |
DpHdlSoftCancel: cancel request for T72_U6678_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Fri Sep 20 11:00:22:235 2019


DpHdlDeadWp: W16 (pid=16463) terminated automatically

Fri Sep 20 11:00:26:207 2019


DpWpDynCreate: created new work process W20-18235

Fri Sep 20 11:05:37:945 2019


DpWpCheck: dyn W20, pid 18235 no longer needed, terminate now

Fri Sep 20 11:05:38:454 2019


DpHdlDeadWp: W20 (pid=18235) terminated automatically

Fri Sep 20 11:12:59:940 2019


DpWpDynCreate: created new work process W19-17957

Fri Sep 20 11:16:03:264 2019


DpWpDynCreate: created new work process W17-19185

Fri Sep 20 11:16:03:376 2019


DpWpDynCreate: created new work process W18-19187

Fri Sep 20 11:18:03:571 2019


DpHdlDeadWp: W19 (pid=17957) terminated automatically

Fri Sep 20 11:21:04:902 2019


DpHdlDeadWp: W17 (pid=19185) terminated automatically

Fri Sep 20 11:21:05:151 2019


DpHdlDeadWp: W18 (pid=19187) terminated automatically

Fri Sep 20 11:21:05:302 2019


DpWpDynCreate: created new work process W16-20964

Fri Sep 20 11:21:11:323 2019


DpWpDynCreate: created new work process W20-20970

Fri Sep 20 11:26:12:544 2019


DpWpCheck: dyn W16, pid 20964 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=20970) terminated automatically

Fri Sep 20 11:26:12:668 2019


DpHdlDeadWp: W16 (pid=20964) terminated automatically

Fri Sep 20 11:28:19:295 2019


DpWpDynCreate: created new work process W19-23365

Fri Sep 20 11:30:54:536 2019


*** ERROR => NiIRead: invalid data (0x300002f/0x8800;mode=0;hdl 66;peer=45.136.108.28:470;local=3200) [nixxi.cpp 5226]

Fri Sep 20 11:33:25:330 2019


DpHdlDeadWp: W19 (pid=23365) terminated automatically

Fri Sep 20 11:34:02:849 2019


DpWpDynCreate: created new work process W17-25534

Fri Sep 20 11:39:18:000 2019


DpWpCheck: dyn W17, pid 25534 no longer needed, terminate now

Fri Sep 20 11:39:18:807 2019


DpHdlDeadWp: W17 (pid=25534) terminated automatically

Fri Sep 20 11:40:12:882 2019


DpWpDynCreate: created new work process W18-27479

Fri Sep 20 11:40:22:469 2019


DpWpDynCreate: created new work process W20-27487

Fri Sep 20 11:40:22:674 2019


DpWpDynCreate: created new work process W16-27488

Fri Sep 20 11:45:15:179 2019


DpHdlDeadWp: W18 (pid=27479) terminated automatically

Fri Sep 20 11:45:38:010 2019


DpWpCheck: dyn W16, pid 27488 no longer needed, terminate now
DpWpCheck: dyn W20, pid 27487 no longer needed, terminate now

Fri Sep 20 11:45:39:084 2019


DpHdlDeadWp: W16 (pid=27488) terminated automatically
DpHdlDeadWp: W20 (pid=27487) terminated automatically

Fri Sep 20 11:46:05:282 2019


DpWpDynCreate: created new work process W19-29459

Fri Sep 20 11:51:08:837 2019


DpHdlDeadWp: W19 (pid=29459) terminated automatically

Fri Sep 20 11:55:09:254 2019


DpWpDynCreate: created new work process W17-32411
DpWpDynCreate: created new work process W18-32412

Fri Sep 20 12:00:11:025 2019


DpWpCheck: dyn W17, pid 32411 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=32412) terminated automatically
DpHdlDeadWp: W17 (pid=32411) terminated automatically

Fri Sep 20 12:00:14:203 2019


DpWpDynCreate: created new work process W16-1537

Fri Sep 20 12:05:18:046 2019


DpWpCheck: dyn W16, pid 1537 no longer needed, terminate now

Fri Sep 20 12:05:18:573 2019


DpHdlDeadWp: W16 (pid=1537) terminated automatically

Fri Sep 20 12:10:04:319 2019


DpWpDynCreate: created new work process W20-351

Fri Sep 20 12:15:05:505 2019


DpHdlDeadWp: W20 (pid=351) terminated automatically

Fri Sep 20 12:15:07:357 2019


DpWpDynCreate: created new work process W19-2426

Fri Sep 20 12:18:25:407 2019


DpWpDynCreate: created new work process W18-3622

Fri Sep 20 12:20:12:961 2019


DpHdlDeadWp: W19 (pid=2426) terminated automatically

Fri Sep 20 12:20:15:140 2019


DpWpDynCreate: created new work process W17-4245

Fri Sep 20 12:20:38:074 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T98_U4834 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T98_U4834_M0 |001|SOLMAN_ADMIN|SST-LAP-HP0055 |11:18:41|18 |SBAL_DISPLAY |high| | |SLG1 |
DpHdlSoftCancel: cancel request for T98_U4834_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Fri Sep 20 12:23:38:079 2019


DpWpCheck: dyn W18, pid 3622 no longer needed, terminate now

Fri Sep 20 12:23:38:522 2019


DpHdlDeadWp: W18 (pid=3622) terminated automatically

Fri Sep 20 12:25:10:655 2019


DpWpDynCreate: created new work process W16-6032

Fri Sep 20 12:25:18:080 2019


DpWpCheck: dyn W17, pid 4245 no longer needed, terminate now

Fri Sep 20 12:25:18:504 2019


DpHdlDeadWp: W17 (pid=4245) terminated automatically

Fri Sep 20 12:28:06:219 2019


DpWpDynCreate: created new work process W20-7100

Fri Sep 20 12:30:18:087 2019


DpWpCheck: dyn W16, pid 6032 no longer needed, terminate now

Fri Sep 20 12:30:18:384 2019


DpHdlDeadWp: W16 (pid=6032) terminated automatically

Fri Sep 20 12:33:08:269 2019


DpHdlDeadWp: W20 (pid=7100) terminated automatically

Fri Sep 20 12:35:03:786 2019


DpWpDynCreate: created new work process W19-9496

Fri Sep 20 12:35:19:959 2019


DpWpDynCreate: created new work process W18-9681

Fri Sep 20 12:40:18:103 2019


DpWpCheck: dyn W19, pid 9496 no longer needed, terminate now

Fri Sep 20 12:40:18:341 2019


DpHdlDeadWp: W19 (pid=9496) terminated automatically

Fri Sep 20 12:40:20:403 2019


DpHdlDeadWp: W18 (pid=9681) terminated automatically

Fri Sep 20 12:40:22:463 2019


DpWpDynCreate: created new work process W17-11357

Fri Sep 20 12:45:28:614 2019


DpHdlDeadWp: W17 (pid=11357) terminated automatically

Fri Sep 20 12:50:13:145 2019


DpWpDynCreate: created new work process W16-14615

Fri Sep 20 12:55:15:902 2019


DpHdlDeadWp: W16 (pid=14615) terminated automatically

Fri Sep 20 12:57:07:861 2019


DpWpDynCreate: created new work process W20-16670

Fri Sep 20 13:00:13:824 2019


DpWpDynCreate: created new work process W19-17713

Fri Sep 20 13:02:18:139 2019


DpWpCheck: dyn W20, pid 16670 no longer needed, terminate now

Fri Sep 20 13:02:18:696 2019


DpHdlDeadWp: W20 (pid=16670) terminated automatically

Fri Sep 20 13:05:18:144 2019


DpWpCheck: dyn W19, pid 17713 no longer needed, terminate now

Fri Sep 20 13:05:18:951 2019


DpHdlDeadWp: W19 (pid=17713) terminated automatically

Fri Sep 20 13:18:06:020 2019


DpWpDynCreate: created new work process W18-19037

Fri Sep 20 13:23:08:338 2019


DpHdlDeadWp: W18 (pid=19037) terminated automatically

Fri Sep 20 13:25:47:326 2019


DpWpDynCreate: created new work process W17-21929

Fri Sep 20 13:30:58:190 2019


DpWpCheck: dyn W17, pid 21929 no longer needed, terminate now

Fri Sep 20 13:30:59:224 2019


DpHdlDeadWp: W17 (pid=21929) terminated automatically

Fri Sep 20 13:31:03:784 2019


DpWpDynCreate: created new work process W16-23617

Fri Sep 20 13:36:18:199 2019


DpWpCheck: dyn W16, pid 23617 no longer needed, terminate now

Fri Sep 20 13:36:18:577 2019


DpHdlDeadWp: W16 (pid=23617) terminated automatically

Fri Sep 20 13:37:21:422 2019


DpWpDynCreate: created new work process W20-25816

Fri Sep 20 13:37:22:461 2019


DpWpDynCreate: created new work process W19-25819

Fri Sep 20 13:42:22:533 2019


DpHdlDeadWp: W20 (pid=25816) terminated automatically

Fri Sep 20 13:42:24:110 2019


DpHdlDeadWp: W19 (pid=25819) terminated automatically

Fri Sep 20 13:48:05:185 2019


DpWpDynCreate: created new work process W18-29269

Fri Sep 20 13:53:07:352 2019


DpHdlDeadWp: W18 (pid=29269) terminated automatically

Fri Sep 20 13:53:07:946 2019


DpWpDynCreate: created new work process W17-30848

Fri Sep 20 13:58:06:438 2019


DpHdlDeadWp: W9 (pid=16737) terminated automatically
DpWpDynCreate: created new work process W9-32536

Fri Sep 20 13:58:26:643 2019


DpHdlDeadWp: W17 (pid=30848) terminated automatically

Fri Sep 20 14:00:03:431 2019


DpWpDynCreate: created new work process W16-762

Fri Sep 20 14:05:18:246 2019


DpWpCheck: dyn W16, pid 762 no longer needed, terminate now

Fri Sep 20 14:05:19:060 2019


DpHdlDeadWp: W16 (pid=762) terminated automatically

Fri Sep 20 14:12:03:320 2019


DpWpDynCreate: created new work process W20-32531

Fri Sep 20 14:12:17:090 2019


DpWpDynCreate: created new work process W19-32736

Fri Sep 20 14:17:04:167 2019


DpHdlDeadWp: W20 (pid=32531) terminated automatically

Fri Sep 20 14:17:18:265 2019


DpWpCheck: dyn W19, pid 32736 no longer needed, terminate now

Fri Sep 20 14:17:18:643 2019


DpHdlDeadWp: W19 (pid=32736) terminated automatically

Fri Sep 20 14:22:30:106 2019


DpWpDynCreate: created new work process W18-3993

Fri Sep 20 14:26:04:163 2019


DpWpDynCreate: created new work process W17-5488

Fri Sep 20 14:27:33:197 2019


DpHdlDeadWp: W18 (pid=3993) terminated automatically

Fri Sep 20 14:31:05:404 2019


DpHdlDeadWp: W17 (pid=5488) terminated automatically

Fri Sep 20 14:32:14:283 2019


DpWpDynCreate: created new work process W16-7503

Fri Sep 20 14:33:03:648 2019


DpWpDynCreate: created new work process W20-7777

Fri Sep 20 14:34:03:203 2019


DpWpDynCreate: created new work process W19-8051
DpWpDynCreate: created new work process W18-8052

Fri Sep 20 14:37:15:717 2019


DpHdlDeadWp: W16 (pid=7503) terminated automatically

Fri Sep 20 14:38:04:645 2019


DpHdlDeadWp: W20 (pid=7777) terminated automatically

Fri Sep 20 14:39:05:056 2019


DpHdlDeadWp: W18 (pid=8052) terminated automatically

Fri Sep 20 14:39:05:503 2019


DpHdlDeadWp: W19 (pid=8051) terminated automatically

Fri Sep 20 14:40:03:147 2019


DpWpDynCreate: created new work process W17-10085

Fri Sep 20 14:42:15:841 2019


DpWpDynCreate: created new work process W16-10734

Fri Sep 20 14:45:14:513 2019


DpHdlDeadWp: W17 (pid=10085) terminated automatically

Fri Sep 20 14:47:16:295 2019


DpHdlDeadWp: W16 (pid=10734) terminated automatically

Fri Sep 20 15:02:06:474 2019


DpWpDynCreate: created new work process W20-19583

Fri Sep 20 15:02:20:240 2019


DpWpDynCreate: created new work process W18-20191

Fri Sep 20 15:07:07:711 2019


DpHdlDeadWp: W20 (pid=19583) terminated automatically

Fri Sep 20 15:07:21:652 2019


DpHdlDeadWp: W18 (pid=20191) terminated automatically

Fri Sep 20 15:17:12:116 2019


DpWpDynCreate: created new work process W19-17472

Fri Sep 20 15:19:43:108 2019


DpWpDynCreate: created new work process W17-18353

Fri Sep 20 15:22:15:364 2019


DpHdlDeadWp: W19 (pid=17472) terminated automatically

Fri Sep 20 15:24:58:380 2019


DpWpCheck: dyn W17, pid 18353 no longer needed, terminate now

Fri Sep 20 15:24:58:637 2019


DpHdlDeadWp: W17 (pid=18353) terminated automatically

Fri Sep 20 15:25:12:505 2019


DpWpDynCreate: created new work process W16-20221
DpWpDynCreate: created new work process W20-20222

Fri Sep 20 15:28:17:441 2019


DpWpDynCreate: created new work process W18-21251

Fri Sep 20 15:30:14:075 2019


DpHdlDeadWp: W16 (pid=20221) terminated automatically

Fri Sep 20 15:30:16:993 2019


DpHdlDeadWp: W20 (pid=20222) terminated automatically

Fri Sep 20 15:33:16:404 2019


DpWpDynCreate: created new work process W19-22830

Fri Sep 20 15:33:18:392 2019


DpWpCheck: dyn W18, pid 21251 no longer needed, terminate now

Fri Sep 20 15:33:18:575 2019


DpHdlDeadWp: W18 (pid=21251) terminated automatically

Fri Sep 20 15:38:17:259 2019


DpHdlDeadWp: W19 (pid=22830) terminated automatically

Fri Sep 20 15:43:16:526 2019


DpWpDynCreate: created new work process W17-26558

Fri Sep 20 15:48:18:417 2019


DpWpCheck: dyn W17, pid 26558 no longer needed, terminate now

Fri Sep 20 15:48:18:561 2019


DpHdlDeadWp: W17 (pid=26558) terminated automatically

Fri Sep 20 15:53:16:878 2019


DpWpDynCreate: created new work process W16-29647

Fri Sep 20 15:58:18:436 2019


DpWpCheck: dyn W16, pid 29647 no longer needed, terminate now

Fri Sep 20 15:58:18:979 2019


DpHdlDeadWp: W16 (pid=29647) terminated automatically

Fri Sep 20 16:04:22:303 2019


DpWpDynCreate: created new work process W20-12493

Fri Sep 20 16:09:23:702 2019


DpHdlDeadWp: W20 (pid=12493) terminated automatically

Fri Sep 20 16:16:34:369 2019


DpWpDynCreate: created new work process W18-389

Fri Sep 20 16:21:38:475 2019


DpWpCheck: dyn W18, pid 389 no longer needed, terminate now

Fri Sep 20 16:21:39:434 2019


DpHdlDeadWp: W18 (pid=389) terminated automatically

Fri Sep 20 16:25:25:931 2019


DpWpDynCreate: created new work process W19-3583

Fri Sep 20 16:28:08:945 2019


DpWpDynCreate: created new work process W17-4597

Fri Sep 20 16:30:26:259 2019


DpHdlDeadWp: W19 (pid=3583) terminated automatically

Fri Sep 20 16:33:09:782 2019


DpHdlDeadWp: W17 (pid=4597) terminated automatically

Fri Sep 20 16:38:13:901 2019


DpWpDynCreate: created new work process W16-7968

Fri Sep 20 16:38:14:336 2019


DpWpDynCreate: created new work process W20-7973

Fri Sep 20 16:38:16:143 2019


DpWpDynCreate: created new work process W18-7987

Fri Sep 20 16:43:18:512 2019


DpWpCheck: dyn W16, pid 7968 no longer needed, terminate now
DpWpCheck: dyn W18, pid 7987 no longer needed, terminate now
DpWpCheck: dyn W20, pid 7973 no longer needed, terminate now

Fri Sep 20 16:43:19:579 2019


DpHdlDeadWp: W16 (pid=7968) terminated automatically
DpHdlDeadWp: W18 (pid=7987) terminated automatically
DpHdlDeadWp: W20 (pid=7973) terminated automatically

Fri Sep 20 16:55:15:814 2019


DpWpDynCreate: created new work process W19-13332

Fri Sep 20 17:00:18:536 2019


DpWpCheck: dyn W19, pid 13332 no longer needed, terminate now

Fri Sep 20 17:00:18:706 2019


DpHdlDeadWp: W19 (pid=13332) terminated automatically

Fri Sep 20 17:00:21:201 2019


DpWpDynCreate: created new work process W17-15145

Fri Sep 20 17:05:22:237 2019


DpHdlDeadWp: W17 (pid=15145) terminated automatically

Fri Sep 20 17:15:19:405 2019


DpWpDynCreate: created new work process W16-15326

Fri Sep 20 17:20:21:119 2019


DpHdlDeadWp: W16 (pid=15326) terminated automatically

Fri Sep 20 17:21:06:725 2019


DpHdlDeadWp: W13 (pid=18887) terminated automatically
DpWpDynCreate: created new work process W13-17305

Fri Sep 20 17:22:45:551 2019


DpHdlDeadWp: W9 (pid=32536) terminated automatically
DpWpDynCreate: created new work process W9-17781

Fri Sep 20 17:37:07:453 2019


DpWpDynCreate: created new work process W18-22405

Fri Sep 20 17:42:09:575 2019


DpHdlDeadWp: W18 (pid=22405) terminated automatically

Fri Sep 20 17:42:13:814 2019


DpWpDynCreate: created new work process W20-24302

Fri Sep 20 17:42:15:163 2019


DpWpDynCreate: created new work process W19-24305

Fri Sep 20 17:47:18:627 2019


DpWpCheck: dyn W19, pid 24305 no longer needed, terminate now
DpWpCheck: dyn W20, pid 24302 no longer needed, terminate now

Fri Sep 20 17:47:18:925 2019


DpHdlDeadWp: W19 (pid=24305) terminated automatically
DpHdlDeadWp: W20 (pid=24302) terminated automatically

Fri Sep 20 17:49:03:810 2019


DpWpDynCreate: created new work process W17-26761

Fri Sep 20 17:54:04:357 2019


DpHdlDeadWp: W17 (pid=26761) terminated automatically

Fri Sep 20 18:01:18:981 2019


DpWpDynCreate: created new work process W16-30854

Fri Sep 20 18:06:19:883 2019


DpHdlDeadWp: W16 (pid=30854) terminated automatically

Fri Sep 20 18:26:08:341 2019


DpWpDynCreate: created new work process W18-2332

Fri Sep 20 18:28:13:534 2019


DpWpDynCreate: created new work process W19-3063

Fri Sep 20 18:31:10:727 2019


DpHdlDeadWp: W18 (pid=2332) terminated automatically

Fri Sep 20 18:33:18:697 2019


DpWpCheck: dyn W19, pid 3063 no longer needed, terminate now

Fri Sep 20 18:33:19:241 2019


DpHdlDeadWp: W19 (pid=3063) terminated automatically

Fri Sep 20 18:35:03:886 2019


DpWpDynCreate: created new work process W20-5270

Fri Sep 20 18:38:06:362 2019


DpWpDynCreate: created new work process W17-6314
DpWpDynCreate: created new work process W16-6315

Fri Sep 20 18:40:06:053 2019


DpHdlDeadWp: W20 (pid=5270) terminated automatically

Fri Sep 20 18:43:08:480 2019


DpHdlDeadWp: W16 (pid=6315) terminated automatically

Fri Sep 20 18:43:17:766 2019


DpHdlDeadWp: W17 (pid=6314) terminated automatically

Fri Sep 20 18:47:13:489 2019


DpWpDynCreate: created new work process W18-9325

Fri Sep 20 18:52:16:431 2019


DpHdlDeadWp: W18 (pid=9325) terminated automatically

Fri Sep 20 18:55:26:093 2019


DpWpDynCreate: created new work process W19-11756

Fri Sep 20 19:00:38:741 2019


DpWpCheck: dyn W19, pid 11756 no longer needed, terminate now

Fri Sep 20 19:00:39:457 2019


DpHdlDeadWp: W19 (pid=11756) terminated automatically

Fri Sep 20 19:22:22:621 2019


DpWpDynCreate: created new work process W20-16015

Fri Sep 20 19:23:15:915 2019


DpWpDynCreate: created new work process W16-16281

Fri Sep 20 19:27:23:243 2019


DpHdlDeadWp: W20 (pid=16015) terminated automatically

Fri Sep 20 19:28:18:783 2019


DpWpCheck: dyn W16, pid 16281 no longer needed, terminate now

Fri Sep 20 19:28:19:850 2019


DpHdlDeadWp: W16 (pid=16281) terminated automatically

Fri Sep 20 19:42:07:255 2019


DpWpDynCreate: created new work process W17-22332

Fri Sep 20 19:42:26:188 2019


DpWpDynCreate: created new work process W18-22438

Fri Sep 20 19:47:09:466 2019


DpHdlDeadWp: W17 (pid=22332) terminated automatically

Fri Sep 20 19:47:28:155 2019


DpHdlDeadWp: W18 (pid=22438) terminated automatically

Fri Sep 20 19:57:06:652 2019


DpHdlDeadWp: W9 (pid=17781) terminated automatically
DpWpDynCreate: created new work process W9-27502

Fri Sep 20 19:58:23:763 2019


DpWpDynCreate: created new work process W19-27766

Fri Sep 20 20:00:20:563 2019


DpWpDynCreate: created new work process W20-28486

Fri Sep 20 20:03:38:854 2019


DpWpCheck: dyn W19, pid 27766 no longer needed, terminate now

Fri Sep 20 20:03:39:051 2019


DpHdlDeadWp: W19 (pid=27766) terminated automatically

Fri Sep 20 20:04:34:263 2019


DpWpDynCreate: created new work process W16-5575

Fri Sep 20 20:05:38:856 2019


DpWpCheck: dyn W20, pid 28486 no longer needed, terminate now

Fri Sep 20 20:05:39:242 2019


DpHdlDeadWp: W20 (pid=28486) terminated automatically

Fri Sep 20 20:09:35:409 2019


DpHdlDeadWp: W16 (pid=5575) terminated automatically

Fri Sep 20 20:20:24:432 2019


DpWpDynCreate: created new work process W17-30469

Fri Sep 20 20:25:23:744 2019


DpWpDynCreate: created new work process W18-32071

Fri Sep 20 20:25:25:750 2019


DpHdlDeadWp: W17 (pid=30469) terminated automatically

Fri Sep 20 20:30:24:609 2019


DpHdlDeadWp: W18 (pid=32071) terminated automatically

Fri Sep 20 20:30:24:882 2019


DpWpDynCreate: created new work process W19-1278

Fri Sep 20 20:35:27:236 2019


DpHdlDeadWp: W19 (pid=1278) terminated automatically

Fri Sep 20 20:50:25:931 2019


DpWpDynCreate: created new work process W20-8564

Fri Sep 20 20:55:27:651 2019


DpHdlDeadWp: W20 (pid=8564) terminated automatically

Fri Sep 20 20:56:09:484 2019


DpHdlDeadWp: W11 (pid=18136) terminated automatically
DpWpDynCreate: created new work process W11-10570

Fri Sep 20 21:00:19:686 2019


DpWpDynCreate: created new work process W16-11808

Fri Sep 20 21:05:16:038 2019


DpWpDynCreate: created new work process W17-27479

Fri Sep 20 21:05:20:241 2019


DpHdlDeadWp: W16 (pid=11808) terminated automatically

Fri Sep 20 21:10:17:358 2019


DpHdlDeadWp: W17 (pid=27479) terminated automatically

Fri Sep 20 21:10:18:874 2019


DpWpDynCreate: created new work process W18-10796

Fri Sep 20 21:15:18:399 2019


DpWpDynCreate: created new work process W19-12518

Fri Sep 20 21:15:21:102 2019


DpHdlDeadWp: W18 (pid=10796) terminated automatically

Fri Sep 20 21:16:06:918 2019


DpWpDynCreate: created new work process W20-12805

Fri Sep 20 21:20:19:200 2019


DpHdlDeadWp: W19 (pid=12518) terminated automatically

Fri Sep 20 21:21:18:979 2019


DpWpCheck: dyn W20, pid 12805 no longer needed, terminate now

Fri Sep 20 21:21:21:391 2019


DpHdlDeadWp: W20 (pid=12805) terminated automatically

Fri Sep 20 21:22:05:833 2019


DpWpDynCreate: created new work process W16-14665

Fri Sep 20 21:23:12:249 2019


DpWpDynCreate: created new work process W17-14942

Fri Sep 20 21:23:13:723 2019


DpWpDynCreate: created new work process W18-14948

Fri Sep 20 21:25:13:896 2019


DpWpDynCreate: created new work process W19-15680

Fri Sep 20 21:27:18:988 2019


DpWpCheck: dyn W16, pid 14665 no longer needed, terminate now

Fri Sep 20 21:27:19:595 2019


DpHdlDeadWp: W16 (pid=14665) terminated automatically

Fri Sep 20 21:28:13:961 2019


DpHdlDeadWp: W17 (pid=14942) terminated automatically

Fri Sep 20 21:28:15:099 2019


DpHdlDeadWp: W18 (pid=14948) terminated automatically

Fri Sep 20 21:30:14:489 2019


DpHdlDeadWp: W19 (pid=15680) terminated automatically

Fri Sep 20 21:30:15:615 2019


DpWpDynCreate: created new work process W20-17295

Fri Sep 20 21:30:45:018 2019


DpWpDynCreate: created new work process W16-17632

Fri Sep 20 21:35:19:001 2019


DpWpCheck: dyn W20, pid 17295 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=17295) terminated automatically

Fri Sep 20 21:35:59:002 2019


DpWpCheck: dyn W16, pid 17632 no longer needed, terminate now

Fri Sep 20 21:35:59:423 2019


DpHdlDeadWp: W16 (pid=17632) terminated automatically

Fri Sep 20 21:40:25:890 2019


DpWpDynCreate: created new work process W17-20448

Fri Sep 20 21:42:08:927 2019


DpWpDynCreate: created new work process W18-21218

Fri Sep 20 21:45:27:216 2019


DpHdlDeadWp: W17 (pid=20448) terminated automatically

Fri Sep 20 21:47:09:533 2019


DpHdlDeadWp: W18 (pid=21218) terminated automatically

Fri Sep 20 21:50:22:280 2019


DpWpDynCreate: created new work process W19-23833

Fri Sep 20 21:55:11:696 2019


DpWpDynCreate: created new work process W20-25567

Fri Sep 20 21:55:23:830 2019


DpHdlDeadWp: W19 (pid=23833) terminated automatically

Fri Sep 20 22:00:12:391 2019


DpHdlDeadWp: W20 (pid=25567) terminated automatically

Fri Sep 20 22:05:16:027 2019


DpWpDynCreate: created new work process W16-10947

Fri Sep 20 22:10:17:283 2019


DpHdlDeadWp: W16 (pid=10947) terminated automatically

Fri Sep 20 22:10:26:665 2019


DpWpDynCreate: created new work process W17-26179

Fri Sep 20 22:15:28:038 2019


DpHdlDeadWp: W17 (pid=26179) terminated automatically

Fri Sep 20 22:20:18:759 2019


DpWpDynCreate: created new work process W18-29425

Fri Sep 20 22:25:19:089 2019


DpWpCheck: dyn W18, pid 29425 no longer needed, terminate now

Fri Sep 20 22:25:19:363 2019


DpHdlDeadWp: W18 (pid=29425) terminated automatically

Fri Sep 20 22:25:19:693 2019


DpWpDynCreate: created new work process W19-30941

Fri Sep 20 22:30:21:594 2019


DpHdlDeadWp: W19 (pid=30941) terminated automatically

Fri Sep 20 22:30:24:291 2019


DpWpDynCreate: created new work process W20-307

Fri Sep 20 22:35:39:111 2019


DpWpCheck: dyn W20, pid 307 no longer needed, terminate now

Fri Sep 20 22:35:39:416 2019


DpHdlDeadWp: W20 (pid=307) terminated automatically

Fri Sep 20 22:36:05:411 2019


DpWpDynCreate: created new work process W16-2410
DpWpDynCreate: created new work process W17-2411

Fri Sep 20 22:41:19:121 2019


DpWpCheck: dyn W16, pid 2410 no longer needed, terminate now
DpWpCheck: dyn W17, pid 2411 no longer needed, terminate now

Fri Sep 20 22:41:19:495 2019


DpHdlDeadWp: W16 (pid=2410) terminated automatically
DpHdlDeadWp: W17 (pid=2411) terminated automatically

Fri Sep 20 22:42:06:727 2019


DpWpDynCreate: created new work process W18-4508

Fri Sep 20 22:42:08:298 2019


DpWpDynCreate: created new work process W19-4515

Fri Sep 20 22:47:15:253 2019


DpHdlDeadWp: W18 (pid=4508) terminated automatically

Fri Sep 20 22:47:47:631 2019


DpHdlDeadWp: W19 (pid=4515) terminated automatically

Fri Sep 20 22:55:22:123 2019


DpWpDynCreate: created new work process W20-9131

Fri Sep 20 22:59:22:923 2019


DpWpDynCreate: created new work process W16-10378

Fri Sep 20 23:00:33:591 2019


DpHdlDeadWp: W20 (pid=9131) terminated automatically

Fri Sep 20 23:04:39:157 2019


DpWpCheck: dyn W16, pid 10378 no longer needed, terminate now

Fri Sep 20 23:04:39:895 2019


DpHdlDeadWp: W16 (pid=10378) terminated automatically

Fri Sep 20 23:05:17:282 2019


DpWpDynCreate: created new work process W17-26164

Fri Sep 20 23:09:04:611 2019


DpWpDynCreate: created new work process W18-9016

Fri Sep 20 23:10:19:167 2019


DpWpCheck: dyn W17, pid 26164 no longer needed, terminate now

Fri Sep 20 23:10:19:359 2019


DpHdlDeadWp: W17 (pid=26164) terminated automatically

Fri Sep 20 23:10:20:383 2019


DpWpDynCreate: created new work process W19-9452

Fri Sep 20 23:14:07:357 2019


DpHdlDeadWp: W18 (pid=9016) terminated automatically

Fri Sep 20 23:15:16:324 2019


DpWpDynCreate: created new work process W20-11115

Fri Sep 20 23:15:21:215 2019


DpHdlDeadWp: W19 (pid=9452) terminated automatically

Fri Sep 20 23:20:17:284 2019


DpHdlDeadWp: W20 (pid=11115) terminated automatically

Fri Sep 20 23:20:21:071 2019


DpWpDynCreate: created new work process W16-12603

Fri Sep 20 23:25:19:916 2019


DpWpDynCreate: created new work process W17-14389

Fri Sep 20 23:25:22:349 2019


DpHdlDeadWp: W16 (pid=12603) terminated automatically

Fri Sep 20 23:26:05:131 2019


DpWpDynCreate: created new work process W18-14697

Fri Sep 20 23:30:22:848 2019


DpWpDynCreate: created new work process W19-15918

Fri Sep 20 23:30:29:548 2019


DpHdlDeadWp: W17 (pid=14389) terminated automatically
DpWpDynCreate: created new work process W20-15963
Fri Sep 20 23:31:19:206 2019
DpWpCheck: dyn W18, pid 14697 no longer needed, terminate now

Fri Sep 20 23:31:19:526 2019


DpHdlDeadWp: W18 (pid=14697) terminated automatically

Fri Sep 20 23:35:39:318 2019


DpWpCheck: dyn W19, pid 15918 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=15963) terminated automatically

Fri Sep 20 23:35:40:338 2019


DpHdlDeadWp: W19 (pid=15918) terminated automatically

Fri Sep 20 23:36:12:472 2019


DpWpDynCreate: created new work process W16-17797

Fri Sep 20 23:41:19:325 2019


DpWpCheck: dyn W16, pid 17797 no longer needed, terminate now
DpHdlDeadWp: W16 (pid=17797) terminated automatically

Fri Sep 20 23:41:26:690 2019


DpWpDynCreate: created new work process W17-19343

Fri Sep 20 23:46:26:136 2019


DpWpDynCreate: created new work process W18-21131

Fri Sep 20 23:46:27:218 2019


DpHdlDeadWp: W17 (pid=19343) terminated automatically

Fri Sep 20 23:51:13:865 2019


DpWpDynCreate: created new work process W20-22570

Fri Sep 20 23:51:39:342 2019


DpWpCheck: dyn W18, pid 21131 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=21131) terminated automatically

Fri Sep 20 23:56:15:183 2019


DpHdlDeadWp: W20 (pid=22570) terminated automatically

Fri Sep 20 23:56:17:521 2019


DpWpDynCreate: created new work process W19-24571

Sat Sep 21 00:00:02:599 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 9->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5
DpWpConf: wp reconfiguration, stop W19, pid 24571
DpAdaptWppriv_max_no : 4 -> 4

Sat Sep 21 00:00:02:716 2019


DpHdlDeadWp: W19 (pid=24571) terminated automatically

Sat Sep 21 00:06:08:206 2019


DpWpDynCreate: created new work process W16-9566
DpWpDynCreate: created new work process W17-9567

Sat Sep 21 00:11:18:259 2019


DpHdlDeadWp: W16 (pid=9566) terminated automatically
DpWpCheck: dyn W17, pid 9567 no longer needed, terminate now

Sat Sep 21 00:11:18:601 2019


DpHdlDeadWp: W17 (pid=9567) terminated automatically

Sat Sep 21 00:11:33:972 2019


DpWpDynCreate: created new work process W18-25047

Sat Sep 21 00:11:47:147 2019


DpWpDynCreate: created new work process W20-25180

Sat Sep 21 00:14:03:572 2019


DpWpDynCreate: created new work process W19-26716

Sat Sep 21 00:16:36:143 2019


DpHdlDeadWp: W18 (pid=25047) terminated automatically

Sat Sep 21 00:16:59:388 2019


DpWpCheck: dyn W20, pid 25180 no longer needed, terminate now

Sat Sep 21 00:16:59:822 2019


DpHdlDeadWp: W20 (pid=25180) terminated automatically

Sat Sep 21 00:19:19:394 2019


DpWpCheck: dyn W19, pid 26716 no longer needed, terminate now

Sat Sep 21 00:19:20:382 2019


DpHdlDeadWp: W19 (pid=26716) terminated automatically

Sat Sep 21 00:21:10:210 2019


DpWpDynCreate: created new work process W16-9375

Sat Sep 21 00:21:23:351 2019


DpWpDynCreate: created new work process W17-9606

Sat Sep 21 00:21:24:549 2019


DpWpDynCreate: created new work process W18-9632

Sat Sep 21 00:26:11:631 2019


DpHdlDeadWp: W16 (pid=9375) terminated automatically

Sat Sep 21 00:26:24:681 2019


DpHdlDeadWp: W17 (pid=9606) terminated automatically

Sat Sep 21 00:26:25:309 2019


DpHdlDeadWp: W18 (pid=9632) terminated automatically

Sat Sep 21 00:26:27:445 2019


DpWpDynCreate: created new work process W20-19737
DpWpDynCreate: created new work process W19-19738

Sat Sep 21 00:31:30:143 2019


DpHdlDeadWp: W19 (pid=19738) terminated automatically

Sat Sep 21 00:31:36:078 2019


DpHdlDeadWp: W20 (pid=19737) terminated automatically

Sat Sep 21 00:31:53:728 2019


DpWpDynCreate: created new work process W16-22261
Sat Sep 21 00:36:42:903 2019
DpWpDynCreate: created new work process W17-23989

Sat Sep 21 00:36:59:426 2019


DpWpCheck: dyn W16, pid 22261 no longer needed, terminate now

Sat Sep 21 00:36:59:780 2019


DpHdlDeadWp: W16 (pid=22261) terminated automatically

Sat Sep 21 00:39:16:727 2019


DpWpDynCreate: created new work process W18-24817

Sat Sep 21 00:41:59:437 2019


DpWpCheck: dyn W17, pid 23989 no longer needed, terminate now

Sat Sep 21 00:41:59:661 2019


DpHdlDeadWp: W17 (pid=23989) terminated automatically

Sat Sep 21 00:43:18:990 2019


DpWpDynCreate: created new work process W19-26425

Sat Sep 21 00:44:18:328 2019


DpHdlDeadWp: W18 (pid=24817) terminated automatically

Sat Sep 21 00:48:27:003 2019


DpHdlDeadWp: W19 (pid=26425) terminated automatically

Sat Sep 21 00:51:20:265 2019


DpWpDynCreate: created new work process W20-9382

Sat Sep 21 00:51:24:415 2019


DpWpDynCreate: created new work process W16-9385

Sat Sep 21 00:56:39:462 2019


DpWpCheck: dyn W16, pid 9385 no longer needed, terminate now
DpWpCheck: dyn W20, pid 9382 no longer needed, terminate now

Sat Sep 21 00:56:40:516 2019


DpHdlDeadWp: W16 (pid=9385) terminated automatically
DpHdlDeadWp: W20 (pid=9382) terminated automatically

Sat Sep 21 00:57:07:544 2019


DpWpDynCreate: created new work process W17-11033

Sat Sep 21 01:01:27:561 2019


DpWpDynCreate: created new work process W18-12810

Sat Sep 21 01:02:19:472 2019


DpWpCheck: dyn W17, pid 11033 no longer needed, terminate now

Sat Sep 21 01:02:19:837 2019


DpHdlDeadWp: W17 (pid=11033) terminated automatically

Sat Sep 21 01:02:46:567 2019


DpWpDynCreate: created new work process W19-15789

Sat Sep 21 01:06:39:480 2019


DpWpCheck: dyn W18, pid 12810 no longer needed, terminate now
Sat Sep 21 01:06:40:151 2019
DpHdlDeadWp: W18 (pid=12810) terminated automatically

Sat Sep 21 01:06:41:402 2019


DpHdlDeadWp: W10 (pid=10080) terminated automatically
DpWpDynCreate: created new work process W10-28398

Sat Sep 21 01:06:53:210 2019


DpWpDynCreate: created new work process W16-28935

Sat Sep 21 01:06:53:382 2019


DpWpDynCreate: created new work process W20-28951

Sat Sep 21 01:07:49:190 2019


DpHdlDeadWp: W19 (pid=15789) terminated automatically

Sat Sep 21 01:08:07:914 2019


DpWpDynCreate: created new work process W17-31239

Sat Sep 21 01:08:10:857 2019


DpWpDynCreate: created new work process W18-31246

Sat Sep 21 01:11:54:761 2019


DpHdlDeadWp: W16 (pid=28935) terminated automatically
DpWpCheck: dyn W20, pid 28951 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=28951) terminated automatically

Sat Sep 21 01:13:11:704 2019


DpHdlDeadWp: W17 (pid=31239) terminated automatically
DpWpCheck: dyn W18, pid 31246 no longer needed, terminate now

Sat Sep 21 01:13:11:828 2019


DpHdlDeadWp: W18 (pid=31246) terminated automatically

Sat Sep 21 01:15:03:611 2019


DpWpDynCreate: created new work process W19-12679

Sat Sep 21 01:16:03:533 2019


DpWpDynCreate: created new work process W16-13087

Sat Sep 21 01:20:05:252 2019


DpHdlDeadWp: W19 (pid=12679) terminated automatically

Sat Sep 21 01:21:05:339 2019


DpHdlDeadWp: W16 (pid=13087) terminated automatically
DpWpDynCreate: created new work process W16-14661

Sat Sep 21 01:23:27:376 2019


DpWpDynCreate: created new work process W20-15313

Sat Sep 21 01:23:27:595 2019


DpWpDynCreate: created new work process W17-15314

Sat Sep 21 01:25:03:591 2019


DpWpDynCreate: created new work process W18-15837

Sat Sep 21 01:26:06:391 2019


DpHdlDeadWp: W16 (pid=14661) terminated automatically
Sat Sep 21 01:27:14:999 2019
DpWpDynCreate: created new work process W19-16496

Sat Sep 21 01:28:28:997 2019


DpWpCheck: dyn W17, pid 15314 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=15313) terminated automatically

Sat Sep 21 01:28:29:414 2019


DpHdlDeadWp: W17 (pid=15314) terminated automatically

Sat Sep 21 01:30:04:224 2019


DpHdlDeadWp: W18 (pid=15837) terminated automatically

Sat Sep 21 01:31:06:685 2019


DpWpDynCreate: created new work process W16-17882

Sat Sep 21 01:31:15:996 2019


DpWpDynCreate: created new work process W20-17889

Sat Sep 21 01:32:19:531 2019


DpWpCheck: dyn W19, pid 16496 no longer needed, terminate now

Sat Sep 21 01:32:19:677 2019


DpHdlDeadWp: W19 (pid=16496) terminated automatically

Sat Sep 21 01:32:27:161 2019


DpWpDynCreate: created new work process W17-18261

Sat Sep 21 01:36:07:408 2019


DpHdlDeadWp: W16 (pid=17882) terminated automatically

Sat Sep 21 01:36:19:539 2019


DpWpCheck: dyn W20, pid 17889 no longer needed, terminate now

Sat Sep 21 01:36:20:504 2019


DpHdlDeadWp: W20 (pid=17889) terminated automatically

Sat Sep 21 01:37:10:852 2019


DpWpDynCreate: created new work process W18-19727

Sat Sep 21 01:37:28:651 2019


DpHdlDeadWp: W17 (pid=18261) terminated automatically

Sat Sep 21 01:38:25:222 2019


DpWpDynCreate: created new work process W19-20141

Sat Sep 21 01:40:42:124 2019


DpHdlDeadWp: W9 (pid=27502) terminated automatically
DpWpDynCreate: created new work process W9-21181

Sat Sep 21 01:42:11:544 2019


DpHdlDeadWp: W18 (pid=19727) terminated automatically

Sat Sep 21 01:42:13:153 2019


DpWpDynCreate: created new work process W16-21464

Sat Sep 21 01:42:21:978 2019


DpWpDynCreate: created new work process W20-21469
Sat Sep 21 01:43:27:270 2019
DpHdlDeadWp: W19 (pid=20141) terminated automatically

Sat Sep 21 01:47:19:558 2019


DpWpCheck: dyn W16, pid 21464 no longer needed, terminate now

Sat Sep 21 01:47:20:064 2019


DpHdlDeadWp: W16 (pid=21464) terminated automatically

Sat Sep 21 01:47:30:489 2019


DpHdlDeadWp: W20 (pid=21469) terminated automatically

Sat Sep 21 01:51:09:794 2019


DpWpDynCreate: created new work process W17-24668

Sat Sep 21 01:56:10:953 2019


DpHdlDeadWp: W17 (pid=24668) terminated automatically

Sat Sep 21 01:56:11:025 2019


DpWpDynCreate: created new work process W18-26240

Sat Sep 21 01:58:10:537 2019


DpWpDynCreate: created new work process W19-27029

Sat Sep 21 02:01:19:581 2019


DpWpCheck: dyn W18, pid 26240 no longer needed, terminate now

Sat Sep 21 02:01:19:768 2019


DpHdlDeadWp: W18 (pid=26240) terminated automatically

Sat Sep 21 02:03:17:709 2019


DpHdlDeadWp: W19 (pid=27029) terminated automatically

Sat Sep 21 02:03:23:045 2019


DpWpDynCreate: created new work process W16-32050

Sat Sep 21 02:07:07:556 2019


DpWpDynCreate: created new work process W20-7860

Sat Sep 21 02:08:39:604 2019


DpWpCheck: dyn W16, pid 32050 no longer needed, terminate now

Sat Sep 21 02:08:39:854 2019


DpHdlDeadWp: W16 (pid=32050) terminated automatically

Sat Sep 21 02:10:03:598 2019


DpWpDynCreate: created new work process W17-13293

Sat Sep 21 02:12:13:726 2019


DpHdlDeadWp: W20 (pid=7860) terminated automatically

Sat Sep 21 02:15:04:911 2019


DpHdlDeadWp: W17 (pid=13293) terminated automatically

Sat Sep 21 02:17:00:460 2019


DpWpDynCreate: created new work process W18-28666

Sat Sep 21 02:18:03:683 2019


DpWpDynCreate: created new work process W19-28921

Sat Sep 21 02:18:03:805 2019


DpWpDynCreate: created new work process W16-28922

Sat Sep 21 02:22:04:325 2019


DpHdlDeadWp: W18 (pid=28666) terminated automatically

Sat Sep 21 02:23:04:641 2019


DpHdlDeadWp: W16 (pid=28922) terminated automatically
DpWpCheck: dyn W19, pid 28921 no longer needed, terminate now

Sat Sep 21 02:23:05:675 2019


DpHdlDeadWp: W19 (pid=28921) terminated automatically
DpWpDynCreate: created new work process W20-30437

Sat Sep 21 02:28:07:010 2019


DpHdlDeadWp: W20 (pid=30437) terminated automatically

Sat Sep 21 02:28:50:319 2019


DpWpDynCreate: created new work process W17-32384

Sat Sep 21 02:30:11:340 2019


DpWpDynCreate: created new work process W18-405

Sat Sep 21 02:31:06:685 2019


DpWpDynCreate: created new work process W16-743

Sat Sep 21 02:33:59:640 2019


DpWpCheck: dyn W17, pid 32384 no longer needed, terminate now

Sat Sep 21 02:34:00:148 2019


DpHdlDeadWp: W17 (pid=32384) terminated automatically

Sat Sep 21 02:35:13:419 2019


DpHdlDeadWp: W18 (pid=405) terminated automatically

Sat Sep 21 02:36:05:722 2019


DpWpDynCreate: created new work process W19-2756
DpWpDynCreate: created new work process W20-2757

Sat Sep 21 02:36:07:791 2019


DpHdlDeadWp: W16 (pid=743) terminated automatically

Sat Sep 21 02:41:07:013 2019


DpHdlDeadWp: W19 (pid=2756) terminated automatically
DpHdlDeadWp: W20 (pid=2757) terminated automatically

Sat Sep 21 02:42:19:564 2019


DpWpDynCreate: created new work process W17-4664

Sat Sep 21 02:43:37:652 2019


DpWpDynCreate: created new work process W18-5064

Sat Sep 21 02:47:21:317 2019


DpHdlDeadWp: W17 (pid=4664) terminated automatically

Sat Sep 21 02:47:53:878 2019


DpWpDynCreate: created new work process W16-6763
Sat Sep 21 02:48:39:667 2019
DpWpCheck: dyn W18, pid 5064 no longer needed, terminate now

Sat Sep 21 02:48:40:303 2019


DpHdlDeadWp: W18 (pid=5064) terminated automatically

Sat Sep 21 02:49:07:294 2019


DpWpDynCreate: created new work process W19-7046

Sat Sep 21 02:52:44:517 2019


DpWpDynCreate: created new work process W20-8380

Sat Sep 21 02:52:55:385 2019


DpHdlDeadWp: W16 (pid=6763) terminated automatically

Sat Sep 21 02:54:19:674 2019


DpWpCheck: dyn W19, pid 7046 no longer needed, terminate now

Sat Sep 21 02:54:20:363 2019


DpHdlDeadWp: W19 (pid=7046) terminated automatically

Sat Sep 21 02:57:09:937 2019


DpWpDynCreate: created new work process W17-9751

Sat Sep 21 02:57:59:680 2019


DpWpCheck: dyn W20, pid 8380 no longer needed, terminate now

Sat Sep 21 02:58:02:785 2019


DpHdlDeadWp: W20 (pid=8380) terminated automatically

Sat Sep 21 02:58:26:593 2019


DpWpDynCreate: created new work process W18-10011

Sat Sep 21 03:02:13:089 2019


DpHdlDeadWp: W17 (pid=9751) terminated automatically

Sat Sep 21 03:02:14:682 2019


DpWpDynCreate: created new work process W16-13954

Sat Sep 21 03:03:17:496 2019


DpWpDynCreate: created new work process W19-18105

Sat Sep 21 03:03:39:689 2019


DpWpCheck: dyn W18, pid 10011 no longer needed, terminate now

Sat Sep 21 03:03:40:270 2019


DpHdlDeadWp: W18 (pid=10011) terminated automatically

Sat Sep 21 03:07:19:695 2019


DpWpCheck: dyn W16, pid 13954 no longer needed, terminate now

Sat Sep 21 03:07:20:507 2019


DpHdlDeadWp: W16 (pid=13954) terminated automatically

Sat Sep 21 03:08:19:697 2019


DpWpCheck: dyn W19, pid 18105 no longer needed, terminate now

Sat Sep 21 03:08:20:519 2019


DpHdlDeadWp: W19 (pid=18105) terminated automatically

Sat Sep 21 03:09:05:816 2019


DpWpDynCreate: created new work process W20-6837

Sat Sep 21 03:14:06:973 2019


DpHdlDeadWp: W20 (pid=6837) terminated automatically

Sat Sep 21 03:14:16:862 2019


DpWpDynCreate: created new work process W17-10996

Sat Sep 21 03:14:24:282 2019


DpWpDynCreate: created new work process W18-11049

Sat Sep 21 03:15:39:707 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T13_U8713 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T13_U8713_M0 |001|SOLMAN_ADMIN|SST-LAP-HP0055 |02:34:03|18 |SAPLSMTR_NAVIGATION |high| | | |
DpHdlSoftCancel: cancel request for T13_U8713_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Sat Sep 21 03:19:17:992 2019


DpHdlDeadWp: W17 (pid=10996) terminated automatically
DpWpDynCreate: created new work process W16-12513

Sat Sep 21 03:19:28:287 2019


DpHdlDeadWp: W18 (pid=11049) terminated automatically

Sat Sep 21 03:20:05:005 2019


DpWpDynCreate: created new work process W19-12774

Sat Sep 21 03:24:20:608 2019


DpHdlDeadWp: W16 (pid=12513) terminated automatically

Sat Sep 21 03:24:25:530 2019


DpWpDynCreate: created new work process W20-14378

Sat Sep 21 03:25:19:724 2019


DpWpCheck: dyn W19, pid 12774 no longer needed, terminate now

Sat Sep 21 03:25:20:709 2019


DpHdlDeadWp: W19 (pid=12774) terminated automatically

Sat Sep 21 03:26:05:455 2019


DpWpDynCreate: created new work process W17-14874

Sat Sep 21 03:27:07:854 2019


DpWpDynCreate: created new work process W18-15242

Sat Sep 21 03:29:28:332 2019


DpHdlDeadWp: W20 (pid=14378) terminated automatically

Sat Sep 21 03:31:06:909 2019


DpHdlDeadWp: W17 (pid=14874) terminated automatically
DpWpDynCreate: created new work process W16-16519

Sat Sep 21 03:32:08:441 2019


DpHdlDeadWp: W18 (pid=15242) terminated automatically

Sat Sep 21 03:33:23:142 2019


DpWpDynCreate: created new work process W19-17321

Sat Sep 21 03:36:07:934 2019


DpHdlDeadWp: W16 (pid=16519) terminated automatically

Sat Sep 21 03:36:09:333 2019


DpWpDynCreate: created new work process W20-18076

Sat Sep 21 03:36:23:575 2019


DpWpDynCreate: created new work process W17-18234

Sat Sep 21 03:38:39:745 2019


DpWpCheck: dyn W19, pid 17321 no longer needed, terminate now

Sat Sep 21 03:38:41:331 2019


DpHdlDeadWp: W19 (pid=17321) terminated automatically

Sat Sep 21 03:39:10:009 2019


DpWpDynCreate: created new work process W18-18988

Sat Sep 21 03:41:25:123 2019


DpWpCheck: dyn W17, pid 18234 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=18076) terminated automatically

Sat Sep 21 03:41:26:212 2019


DpHdlDeadWp: W17 (pid=18234) terminated automatically

Sat Sep 21 03:43:19:344 2019


DpWpDynCreate: created new work process W16-20435

Sat Sep 21 03:43:38:730 2019


DpWpDynCreate: created new work process W19-20798

Sat Sep 21 03:44:19:752 2019


DpWpCheck: dyn W18, pid 18988 no longer needed, terminate now

Sat Sep 21 03:44:20:354 2019


DpHdlDeadWp: W18 (pid=18988) terminated automatically

Sat Sep 21 03:45:16:939 2019


DpWpDynCreate: created new work process W20-21274

Sat Sep 21 03:45:23:327 2019


DpWpDynCreate: created new work process W17-21279

Sat Sep 21 03:48:39:760 2019


DpWpCheck: dyn W16, pid 20435 no longer needed, terminate now
DpWpCheck: dyn W19, pid 20798 no longer needed, terminate now

Sat Sep 21 03:48:40:658 2019


DpHdlDeadWp: W16 (pid=20435) terminated automatically
DpHdlDeadWp: W19 (pid=20798) terminated automatically

Sat Sep 21 03:48:47:039 2019


DpWpDynCreate: created new work process W18-22436
Sat Sep 21 03:50:04:469 2019
DpWpDynCreate: created new work process W16-22763

Sat Sep 21 03:50:19:802 2019


DpHdlDeadWp: W20 (pid=21274) terminated automatically

Sat Sep 21 03:50:24:901 2019


DpHdlDeadWp: W17 (pid=21279) terminated automatically

Sat Sep 21 03:53:17:987 2019


DpWpDynCreate: created new work process W19-23882

Sat Sep 21 03:53:59:806 2019


DpWpCheck: dyn W18, pid 22436 no longer needed, terminate now

Sat Sep 21 03:54:00:078 2019


DpHdlDeadWp: W18 (pid=22436) terminated automatically

Sat Sep 21 03:55:14:420 2019


DpWpDynCreate: created new work process W20-24659

Sat Sep 21 03:55:15:998 2019


DpHdlDeadWp: W16 (pid=22763) terminated automatically

Sat Sep 21 03:58:28:956 2019


DpHdlDeadWp: W19 (pid=23882) terminated automatically

Sat Sep 21 04:00:12:961 2019


DpWpDynCreate: created new work process W17-26322

Sat Sep 21 04:00:16:489 2019


DpHdlDeadWp: W20 (pid=24659) terminated automatically

Sat Sep 21 04:00:20:802 2019


DpWpDynCreate: created new work process W18-26425

Sat Sep 21 04:05:05:864 2019


DpWpDynCreate: created new work process W16-9189

Sat Sep 21 04:05:13:229 2019


DpHdlDeadWp: W17 (pid=26322) terminated automatically

Sat Sep 21 04:05:17:247 2019


DpWpDynCreate: created new work process W19-9744

Sat Sep 21 04:05:39:823 2019


DpWpCheck: dyn W18, pid 26425 no longer needed, terminate now

Sat Sep 21 04:05:40:359 2019


DpHdlDeadWp: W18 (pid=26425) terminated automatically

Sat Sep 21 04:07:26:960 2019


DpWpDynCreate: created new work process W20-13783

Sat Sep 21 04:10:06:295 2019


DpHdlDeadWp: W16 (pid=9189) terminated automatically
DpWpDynCreate: created new work process W17-24977

Sat Sep 21 04:10:19:831 2019


DpWpCheck: dyn W19, pid 9744 no longer needed, terminate now

Sat Sep 21 04:10:20:081 2019


DpHdlDeadWp: W19 (pid=9744) terminated automatically

Sat Sep 21 04:12:07:516 2019


DpWpDynCreate: created new work process W18-25812

Sat Sep 21 04:12:39:835 2019


DpWpCheck: dyn W20, pid 13783 no longer needed, terminate now

Sat Sep 21 04:12:40:583 2019


DpHdlDeadWp: W20 (pid=13783) terminated automatically

Sat Sep 21 04:15:12:372 2019


DpHdlDeadWp: W17 (pid=24977) terminated automatically

Sat Sep 21 04:15:16:519 2019


DpWpDynCreate: created new work process W16-26772

Sat Sep 21 04:17:19:841 2019


DpWpCheck: dyn W18, pid 25812 no longer needed, terminate now

Sat Sep 21 04:17:20:938 2019


DpHdlDeadWp: W18 (pid=25812) terminated automatically

Sat Sep 21 04:18:07:013 2019


DpWpDynCreate: created new work process W19-27565

Sat Sep 21 04:20:19:382 2019


DpHdlDeadWp: W16 (pid=26772) terminated automatically

Sat Sep 21 04:22:20:015 2019


DpWpDynCreate: created new work process W20-29065

Sat Sep 21 04:23:08:264 2019


DpHdlDeadWp: W19 (pid=27565) terminated automatically
DpWpDynCreate: created new work process W19-29351

Sat Sep 21 04:24:08:070 2019


DpWpDynCreate: created new work process W17-29645

Sat Sep 21 04:27:39:859 2019


DpWpCheck: dyn W20, pid 29065 no longer needed, terminate now

Sat Sep 21 04:27:40:543 2019


DpHdlDeadWp: W20 (pid=29065) terminated automatically

Sat Sep 21 04:28:10:039 2019


DpHdlDeadWp: W19 (pid=29351) terminated automatically
DpWpDynCreate: created new work process W19-30866

Sat Sep 21 04:28:24:170 2019


DpWpDynCreate: created new work process W18-30873

Sat Sep 21 04:29:09:974 2019


DpHdlDeadWp: W17 (pid=29645) terminated automatically

Sat Sep 21 04:29:56:088 2019


DpWpDynCreate: created new work process W16-31597

Sat Sep 21 04:33:12:109 2019


DpHdlDeadWp: W19 (pid=30866) terminated automatically

Sat Sep 21 04:33:26:330 2019


DpHdlDeadWp: W18 (pid=30873) terminated automatically

Sat Sep 21 04:34:59:903 2019


DpHdlDeadWp: W16 (pid=31597) terminated automatically

Sat Sep 21 04:37:20:906 2019


DpWpDynCreate: created new work process W20-1434

Sat Sep 21 04:42:39:916 2019


DpWpCheck: dyn W20, pid 1434 no longer needed, terminate now

Sat Sep 21 04:42:40:849 2019


DpHdlDeadWp: W20 (pid=1434) terminated automatically

Sat Sep 21 04:43:42:381 2019


DpWpDynCreate: created new work process W17-4021

Sat Sep 21 04:45:19:564 2019


DpWpDynCreate: created new work process W19-4440

Sat Sep 21 04:45:25:193 2019


DpWpDynCreate: created new work process W18-4478
DpWpDynCreate: created new work process W16-4479

Sat Sep 21 04:48:51:640 2019


DpHdlDeadWp: W17 (pid=4021) terminated automatically

Sat Sep 21 04:50:28:292 2019


DpWpCheck: dyn W16, pid 4479 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=4478) terminated automatically
DpWpCheck: dyn W19, pid 4440 no longer needed, terminate now

Sat Sep 21 04:50:29:300 2019


DpHdlDeadWp: W16 (pid=4479) terminated automatically
DpHdlDeadWp: W19 (pid=4440) terminated automatically

Sat Sep 21 04:53:09:514 2019


DpWpDynCreate: created new work process W20-7155

Sat Sep 21 04:55:14:724 2019


DpWpDynCreate: created new work process W17-7766

Sat Sep 21 04:58:28:823 2019


DpHdlDeadWp: W20 (pid=7155) terminated automatically

Sat Sep 21 05:00:14:388 2019


DpWpDynCreate: created new work process W18-9628

Sat Sep 21 05:00:15:889 2019


DpHdlDeadWp: W17 (pid=7766) terminated automatically

Sat Sep 21 05:05:06:850 2019


DpWpDynCreate: created new work process W16-24820
Sat Sep 21 05:05:17:133 2019
DpHdlDeadWp: W18 (pid=9628) terminated automatically
DpWpDynCreate: created new work process W19-25274

Sat Sep 21 05:09:03:798 2019


DpWpDynCreate: created new work process W20-8008

Sat Sep 21 05:10:11:846 2019


DpHdlDeadWp: W16 (pid=24820) terminated automatically

Sat Sep 21 05:10:18:430 2019


DpHdlDeadWp: W19 (pid=25274) terminated automatically

Sat Sep 21 05:14:06:738 2019


DpHdlDeadWp: W20 (pid=8008) terminated automatically

Sat Sep 21 05:15:06:957 2019


DpWpDynCreate: created new work process W17-9869

Sat Sep 21 05:18:06:177 2019


DpWpDynCreate: created new work process W18-10995

Sat Sep 21 05:20:07:489 2019


DpHdlDeadWp: W17 (pid=9869) terminated automatically

Sat Sep 21 05:20:12:633 2019


DpWpDynCreate: created new work process W16-11632

Sat Sep 21 05:23:08:542 2019


DpHdlDeadWp: W18 (pid=10995) terminated automatically

Sat Sep 21 05:23:10:422 2019


DpWpDynCreate: created new work process W19-12520

Sat Sep 21 05:25:13:131 2019


DpHdlDeadWp: W16 (pid=11632) terminated automatically

Sat Sep 21 05:25:14:304 2019


DpWpDynCreate: created new work process W20-13192

Sat Sep 21 05:28:11:670 2019


DpWpDynCreate: created new work process W17-14225
DpHdlDeadWp: W19 (pid=12520) terminated automatically
DpWpDynCreate: created new work process W18-14226

Sat Sep 21 05:29:57:162 2019


DpWpDynCreate: created new work process W16-14898

Sat Sep 21 05:30:20:002 2019


DpWpCheck: dyn W20, pid 13192 no longer needed, terminate now

Sat Sep 21 05:30:20:680 2019


DpHdlDeadWp: W20 (pid=13192) terminated automatically

Sat Sep 21 05:33:12:980 2019


DpWpCheck: dyn W17, pid 14225 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=14226) terminated automatically
Sat Sep 21 05:33:13:883 2019
DpHdlDeadWp: W17 (pid=14225) terminated automatically

Sat Sep 21 05:33:19:285 2019


DpWpDynCreate: created new work process W19-15834

Sat Sep 21 05:34:59:326 2019


DpHdlDeadWp: W16 (pid=14898) terminated automatically

Sat Sep 21 05:37:06:602 2019


DpWpDynCreate: created new work process W20-17015

Sat Sep 21 05:38:20:018 2019


DpWpCheck: dyn W19, pid 15834 no longer needed, terminate now

Sat Sep 21 05:38:20:568 2019


DpHdlDeadWp: W19 (pid=15834) terminated automatically

Sat Sep 21 05:40:08:020 2019


DpWpDynCreate: created new work process W18-17915

Sat Sep 21 05:40:13:085 2019


DpWpDynCreate: created new work process W17-17918

Sat Sep 21 05:42:20:029 2019


DpWpCheck: dyn W20, pid 17015 no longer needed, terminate now

Sat Sep 21 05:42:20:831 2019


DpHdlDeadWp: W20 (pid=17015) terminated automatically

Sat Sep 21 05:43:07:484 2019


DpWpDynCreate: created new work process W16-18782

Sat Sep 21 05:44:52:045 2019


DpWpDynCreate: created new work process W19-19514

Sat Sep 21 05:45:10:084 2019


DpHdlDeadWp: W18 (pid=17915) terminated automatically

Sat Sep 21 05:45:20:032 2019


DpWpCheck: dyn W17, pid 17918 no longer needed, terminate now

Sat Sep 21 05:45:21:123 2019


DpHdlDeadWp: W17 (pid=17918) terminated automatically

Sat Sep 21 05:48:09:431 2019


DpHdlDeadWp: W16 (pid=18782) terminated automatically

Sat Sep 21 05:48:09:535 2019


DpWpDynCreate: created new work process W20-20573

Sat Sep 21 05:50:00:041 2019


DpWpCheck: dyn W19, pid 19514 no longer needed, terminate now

Sat Sep 21 05:50:00:359 2019


DpHdlDeadWp: W19 (pid=19514) terminated automatically

Sat Sep 21 05:50:04:539 2019


DpWpDynCreate: created new work process W18-21259
Sat Sep 21 05:53:24:756 2019
DpHdlDeadWp: W20 (pid=20573) terminated automatically

Sat Sep 21 05:55:05:898 2019


DpWpDynCreate: created new work process W17-22863

Sat Sep 21 05:55:16:763 2019


DpWpDynCreate: created new work process W16-22951

Sat Sep 21 05:55:20:051 2019


DpWpCheck: dyn W18, pid 21259 no longer needed, terminate now

Sat Sep 21 05:55:20:152 2019


DpHdlDeadWp: W18 (pid=21259) terminated automatically

Sat Sep 21 05:55:23:055 2019


DpWpDynCreate: created new work process W19-23064

Sat Sep 21 06:00:07:287 2019


DpHdlDeadWp: W17 (pid=22863) terminated automatically

Sat Sep 21 06:00:17:769 2019


DpHdlDeadWp: W16 (pid=22951) terminated automatically

Sat Sep 21 06:00:24:569 2019


DpWpDynCreate: created new work process W20-25284

Sat Sep 21 06:00:40:061 2019


DpWpCheck: dyn W19, pid 23064 no longer needed, terminate now

Sat Sep 21 06:00:40:352 2019


DpHdlDeadWp: W19 (pid=23064) terminated automatically

Sat Sep 21 06:03:53:855 2019


DpWpDynCreate: created new work process W18-3892

Sat Sep 21 06:05:19:053 2019


DpWpDynCreate: created new work process W17-8583

Sat Sep 21 06:05:27:702 2019


DpHdlDeadWp: W20 (pid=25284) terminated automatically

Sat Sep 21 06:09:00:082 2019


DpWpCheck: dyn W18, pid 3892 no longer needed, terminate now

Sat Sep 21 06:09:00:345 2019


DpHdlDeadWp: W18 (pid=3892) terminated automatically

Sat Sep 21 06:10:20:083 2019


DpWpCheck: dyn W17, pid 8583 no longer needed, terminate now

Sat Sep 21 06:10:20:216 2019


DpHdlDeadWp: W17 (pid=8583) terminated automatically

Sat Sep 21 06:10:28:002 2019


DpWpDynCreate: created new work process W16-23559
DpWpDynCreate: created new work process W19-23560
Sat Sep 21 06:12:14:051 2019
DpWpDynCreate: created new work process W20-24378

Sat Sep 21 06:13:28:158 2019


DpWpDynCreate: created new work process W18-25147

Sat Sep 21 06:15:40:090 2019


DpWpCheck: dyn W16, pid 23559 no longer needed, terminate now
DpWpCheck: dyn W19, pid 23560 no longer needed, terminate now

Sat Sep 21 06:15:41:223 2019


DpHdlDeadWp: W16 (pid=23559) terminated automatically
DpHdlDeadWp: W19 (pid=23560) terminated automatically

Sat Sep 21 06:17:20:092 2019


DpWpCheck: dyn W20, pid 24378 no longer needed, terminate now

Sat Sep 21 06:17:20:307 2019


DpHdlDeadWp: W20 (pid=24378) terminated automatically

Sat Sep 21 06:18:28:099 2019


DpWpDynCreate: created new work process W17-26699

Sat Sep 21 06:18:40:094 2019


DpWpCheck: dyn W18, pid 25147 no longer needed, terminate now

Sat Sep 21 06:18:41:865 2019


DpHdlDeadWp: W18 (pid=25147) terminated automatically

Sat Sep 21 06:20:27:420 2019


DpWpDynCreate: created new work process W16-27272

Sat Sep 21 06:23:21:353 2019


DpWpDynCreate: created new work process W19-28397

Sat Sep 21 06:23:40:102 2019


DpWpCheck: dyn W17, pid 26699 no longer needed, terminate now

Sat Sep 21 06:23:41:213 2019


DpHdlDeadWp: W17 (pid=26699) terminated automatically

Sat Sep 21 06:25:40:106 2019


DpWpCheck: dyn W16, pid 27272 no longer needed, terminate now

Sat Sep 21 06:25:40:373 2019


DpHdlDeadWp: W16 (pid=27272) terminated automatically

Sat Sep 21 06:27:11:608 2019


DpWpDynCreate: created new work process W20-29598

Sat Sep 21 06:28:40:112 2019


DpWpCheck: dyn W19, pid 28397 no longer needed, terminate now

Sat Sep 21 06:28:40:626 2019


DpHdlDeadWp: W19 (pid=28397) terminated automatically

Sat Sep 21 06:31:04:793 2019


DpWpDynCreate: created new work process W18-30795
Sat Sep 21 06:32:12:887 2019
DpHdlDeadWp: W20 (pid=29598) terminated automatically

Sat Sep 21 06:36:04:956 2019


DpWpDynCreate: created new work process W17-32520

Sat Sep 21 06:36:05:544 2019


DpHdlDeadWp: W18 (pid=30795) terminated automatically

Sat Sep 21 06:38:16:590 2019


DpWpDynCreate: created new work process W16-727

Sat Sep 21 06:41:10:775 2019


DpHdlDeadWp: W17 (pid=32520) terminated automatically

Sat Sep 21 06:41:23:732 2019


DpWpDynCreate: created new work process W19-1779

Sat Sep 21 06:43:17:544 2019


DpHdlDeadWp: W16 (pid=727) terminated automatically

Sat Sep 21 06:44:20:632 2019


DpWpDynCreate: created new work process W20-2955

Sat Sep 21 06:46:40:147 2019


DpWpCheck: dyn W19, pid 1779 no longer needed, terminate now

Sat Sep 21 06:46:40:760 2019


DpHdlDeadWp: W19 (pid=1779) terminated automatically

Sat Sep 21 06:47:27:707 2019


DpWpDynCreate: created new work process W18-3950

Sat Sep 21 06:47:39:157 2019


DpWpDynCreate: created new work process W17-4090

Sat Sep 21 06:49:28:379 2019


DpHdlDeadWp: W20 (pid=2955) terminated automatically

Sat Sep 21 06:52:40:158 2019


DpWpCheck: dyn W17, pid 4090 no longer needed, terminate now
DpWpCheck: dyn W18, pid 3950 no longer needed, terminate now

Sat Sep 21 06:52:40:582 2019


DpHdlDeadWp: W18 (pid=3950) terminated automatically

Sat Sep 21 06:52:42:320 2019


DpHdlDeadWp: W17 (pid=4090) terminated automatically

Sat Sep 21 06:57:05:951 2019


DpWpDynCreate: created new work process W16-7138

Sat Sep 21 06:57:09:721 2019


DpWpDynCreate: created new work process W19-7162

Sat Sep 21 07:02:07:956 2019


DpHdlDeadWp: W16 (pid=7138) terminated automatically

Sat Sep 21 07:02:18:768 2019


DpHdlDeadWp: W19 (pid=7162) terminated automatically

Sat Sep 21 07:08:06:361 2019


DpWpDynCreate: created new work process W20-582

Sat Sep 21 07:08:14:619 2019


DpWpDynCreate: created new work process W18-1301

Sat Sep 21 07:10:04:547 2019


DpWpDynCreate: created new work process W17-6729

Sat Sep 21 07:13:09:150 2019


DpHdlDeadWp: W20 (pid=582) terminated automatically

Sat Sep 21 07:13:20:191 2019


DpWpCheck: dyn W18, pid 1301 no longer needed, terminate now

Sat Sep 21 07:13:20:480 2019


DpHdlDeadWp: W18 (pid=1301) terminated automatically

Sat Sep 21 07:13:34:210 2019


DpWpDynCreate: created new work process W16-8179

Sat Sep 21 07:15:05:785 2019


DpWpDynCreate: created new work process W19-8659
DpHdlDeadWp: W17 (pid=6729) terminated automatically

Sat Sep 21 07:18:35:859 2019


DpHdlDeadWp: W16 (pid=8179) terminated automatically

Sat Sep 21 07:19:12:210 2019


DpWpDynCreate: created new work process W20-10021

Sat Sep 21 07:20:09:524 2019


DpWpDynCreate: created new work process W18-10404

Sat Sep 21 07:20:20:200 2019


DpWpCheck: dyn W19, pid 8659 no longer needed, terminate now

Sat Sep 21 07:20:21:071 2019


DpHdlDeadWp: W19 (pid=8659) terminated automatically

Sat Sep 21 07:23:10:789 2019


DpWpDynCreate: created new work process W17-11403

Sat Sep 21 07:24:19:443 2019


DpHdlDeadWp: W20 (pid=10021) terminated automatically

Sat Sep 21 07:25:11:171 2019


DpHdlDeadWp: W18 (pid=10404) terminated automatically

Sat Sep 21 07:27:20:624 2019


DpWpDynCreate: created new work process W16-12638

Sat Sep 21 07:28:20:216 2019


DpWpCheck: dyn W17, pid 11403 no longer needed, terminate now

Sat Sep 21 07:28:20:786 2019


DpHdlDeadWp: W17 (pid=11403) terminated automatically
Sat Sep 21 07:29:05:327 2019
DpWpDynCreate: created new work process W19-13174

Sat Sep 21 07:29:24:664 2019


DpWpDynCreate: created new work process W20-13286

Sat Sep 21 07:31:09:635 2019


DpWpDynCreate: created new work process W18-14021

Sat Sep 21 07:32:40:225 2019


DpWpCheck: dyn W16, pid 12638 no longer needed, terminate now

Sat Sep 21 07:32:41:147 2019


DpHdlDeadWp: W16 (pid=12638) terminated automatically

Sat Sep 21 07:34:20:229 2019


DpWpCheck: dyn W19, pid 13174 no longer needed, terminate now

Sat Sep 21 07:34:21:225 2019


DpHdlDeadWp: W19 (pid=13174) terminated automatically

Sat Sep 21 07:34:40:231 2019


DpWpCheck: dyn W20, pid 13286 no longer needed, terminate now

Sat Sep 21 07:34:41:306 2019


DpHdlDeadWp: W20 (pid=13286) terminated automatically

Sat Sep 21 07:36:11:768 2019


DpHdlDeadWp: W18 (pid=14021) terminated automatically

Sat Sep 21 07:37:13:102 2019


DpWpDynCreate: created new work process W17-15806

Sat Sep 21 07:37:30:377 2019


DpWpDynCreate: created new work process W16-15960

Sat Sep 21 07:37:41:148 2019


DpWpDynCreate: created new work process W19-16063

Sat Sep 21 07:37:41:354 2019


DpWpDynCreate: created new work process W20-16064

Sat Sep 21 07:42:17:032 2019


DpHdlDeadWp: W17 (pid=15806) terminated automatically

Sat Sep 21 07:42:40:244 2019


DpWpCheck: dyn W16, pid 15960 no longer needed, terminate now

Sat Sep 21 07:42:41:427 2019


DpHdlDeadWp: W16 (pid=15960) terminated automatically

Sat Sep 21 07:42:43:883 2019


DpWpCheck: dyn W19, pid 16063 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=16064) terminated automatically

Sat Sep 21 07:42:44:972 2019


DpHdlDeadWp: W19 (pid=16063) terminated automatically
Sat Sep 21 07:43:04:694 2019
DpWpDynCreate: created new work process W18-17658

Sat Sep 21 07:44:29:582 2019


DpWpDynCreate: created new work process W17-18086

Sat Sep 21 07:44:31:268 2019


DpWpDynCreate: created new work process W16-18174

Sat Sep 21 07:46:04:459 2019


DpWpDynCreate: created new work process W20-18656

Sat Sep 21 07:46:05:518 2019


DpWpDynCreate: created new work process W19-18659

Sat Sep 21 07:48:20:253 2019


DpWpCheck: dyn W18, pid 17658 no longer needed, terminate now

Sat Sep 21 07:48:21:331 2019


DpHdlDeadWp: W18 (pid=17658) terminated automatically

Sat Sep 21 07:49:40:256 2019


DpWpCheck: dyn W16, pid 18174 no longer needed, terminate now
DpWpCheck: dyn W17, pid 18086 no longer needed, terminate now

Sat Sep 21 07:49:41:202 2019


DpHdlDeadWp: W16 (pid=18174) terminated automatically
DpHdlDeadWp: W17 (pid=18086) terminated automatically

Sat Sep 21 07:50:05:385 2019


DpWpDynCreate: created new work process W18-19968

Sat Sep 21 07:51:05:742 2019


DpHdlDeadWp: W20 (pid=18656) terminated automatically

Sat Sep 21 07:51:08:837 2019


DpWpDynCreate: created new work process W16-20426

Sat Sep 21 07:51:20:267 2019


DpWpCheck: dyn W19, pid 18659 no longer needed, terminate now

Sat Sep 21 07:51:20:862 2019


DpHdlDeadWp: W19 (pid=18659) terminated automatically

Sat Sep 21 07:53:37:237 2019


DpWpDynCreate: created new work process W17-21305

Sat Sep 21 07:53:40:716 2019


DpWpDynCreate: created new work process W20-21328

Sat Sep 21 07:55:09:635 2019


DpHdlDeadWp: W12 (pid=1258) terminated automatically
DpWpDynCreate: created new work process W12-21660
DpWpCheck: dyn W18, pid 19968 no longer needed, terminate now

Sat Sep 21 07:55:10:040 2019


DpHdlDeadWp: W18 (pid=19968) terminated automatically

Sat Sep 21 07:56:12:213 2019


DpHdlDeadWp: W16 (pid=20426) terminated automatically

Sat Sep 21 07:57:05:407 2019


DpWpDynCreate: created new work process W19-22294

Sat Sep 21 07:58:40:278 2019


DpWpCheck: dyn W17, pid 21305 no longer needed, terminate now
DpHdlDeadWp: W17 (pid=21305) terminated automatically

Sat Sep 21 07:59:00:278 2019


DpWpCheck: dyn W20, pid 21328 no longer needed, terminate now

Sat Sep 21 07:59:00:469 2019


DpHdlDeadWp: W20 (pid=21328) terminated automatically

Sat Sep 21 07:59:25:911 2019


DpWpDynCreate: created new work process W18-23128

Sat Sep 21 08:02:20:283 2019


DpWpCheck: dyn W19, pid 22294 no longer needed, terminate now

Sat Sep 21 08:02:20:648 2019


DpHdlDeadWp: W19 (pid=22294) terminated automatically

Sat Sep 21 08:03:28:476 2019


DpWpDynCreate: created new work process W16-31928

Sat Sep 21 08:03:52:445 2019


DpWpDynCreate: created new work process W17-686

Sat Sep 21 08:04:39:840 2019


DpHdlDeadWp: W18 (pid=23128) terminated automatically

Sat Sep 21 08:06:06:402 2019


DpWpDynCreate: created new work process W20-8431

Sat Sep 21 08:08:40:293 2019


DpWpCheck: dyn W16, pid 31928 no longer needed, terminate now

Sat Sep 21 08:08:41:044 2019


DpHdlDeadWp: W16 (pid=31928) terminated automatically

Sat Sep 21 08:09:00:294 2019


DpWpCheck: dyn W17, pid 686 no longer needed, terminate now

Sat Sep 21 08:09:00:763 2019


DpHdlDeadWp: W17 (pid=686) terminated automatically

Sat Sep 21 08:10:58:202 2019


DpWpDynCreate: created new work process W19-22501

Sat Sep 21 08:11:20:298 2019


DpWpCheck: dyn W20, pid 8431 no longer needed, terminate now

Sat Sep 21 08:11:21:278 2019


DpHdlDeadWp: W20 (pid=8431) terminated automatically

Sat Sep 21 08:16:00:305 2019


DpWpCheck: dyn W19, pid 22501 no longer needed, terminate now
Sat Sep 21 08:16:00:522 2019
DpHdlDeadWp: W19 (pid=22501) terminated automatically

Sat Sep 21 08:16:08:763 2019


DpWpDynCreate: created new work process W18-24116

Sat Sep 21 08:21:04:472 2019


DpWpDynCreate: created new work process W16-26201

Sat Sep 21 08:21:09:955 2019


DpHdlDeadWp: W18 (pid=24116) terminated automatically

Sat Sep 21 08:24:07:499 2019


DpWpDynCreate: created new work process W17-27030

Sat Sep 21 08:26:07:430 2019


DpHdlDeadWp: W16 (pid=26201) terminated automatically

Sat Sep 21 08:27:03:755 2019


DpWpDynCreate: created new work process W20-28070

Sat Sep 21 08:29:08:735 2019


DpHdlDeadWp: W17 (pid=27030) terminated automatically

Sat Sep 21 08:31:37:060 2019


DpWpDynCreate: created new work process W19-29585

Sat Sep 21 08:32:20:330 2019


DpWpCheck: dyn W20, pid 28070 no longer needed, terminate now

Sat Sep 21 08:32:20:970 2019


DpHdlDeadWp: W20 (pid=28070) terminated automatically

Sat Sep 21 08:33:03:744 2019


DpWpDynCreate: created new work process W18-29998

Sat Sep 21 08:36:40:337 2019


DpWpCheck: dyn W19, pid 29585 no longer needed, terminate now

Sat Sep 21 08:36:41:254 2019


DpHdlDeadWp: W19 (pid=29585) terminated automatically

Sat Sep 21 08:38:04:569 2019


DpHdlDeadWp: W18 (pid=29998) terminated automatically

Sat Sep 21 08:39:22:927 2019


DpWpDynCreate: created new work process W16-31865

Sat Sep 21 08:44:23:223 2019


DpHdlDeadWp: W16 (pid=31865) terminated automatically

Sat Sep 21 08:44:30:199 2019


DpWpDynCreate: created new work process W17-1293

Sat Sep 21 08:44:32:324 2019


DpWpDynCreate: created new work process W20-1299

Sat Sep 21 08:49:34:113 2019


DpWpCheck: dyn W17, pid 1293 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=1299) terminated automatically
DpHdlDeadWp: W17 (pid=1293) terminated automatically

Sat Sep 21 08:49:35:841 2019


DpWpDynCreate: created new work process W19-3250

Sat Sep 21 08:49:36:123 2019


DpWpDynCreate: created new work process W18-3252

Sat Sep 21 08:54:40:366 2019


DpWpCheck: dyn W18, pid 3252 no longer needed, terminate now
DpWpCheck: dyn W19, pid 3250 no longer needed, terminate now

Sat Sep 21 08:54:41:464 2019


DpHdlDeadWp: W18 (pid=3252) terminated automatically
DpHdlDeadWp: W19 (pid=3250) terminated automatically

Sat Sep 21 08:55:11:077 2019


DpWpDynCreate: created new work process W16-5087

Sat Sep 21 08:56:06:908 2019


DpWpDynCreate: created new work process W20-5333

Sat Sep 21 09:00:20:376 2019


DpWpCheck: dyn W16, pid 5087 no longer needed, terminate now

Sat Sep 21 09:00:20:841 2019


DpHdlDeadWp: W16 (pid=5087) terminated automatically

Sat Sep 21 09:00:38:829 2019


DpWpDynCreate: created new work process W17-7179

Sat Sep 21 09:01:20:377 2019


DpWpCheck: dyn W20, pid 5333 no longer needed, terminate now

Sat Sep 21 09:01:20:993 2019


DpHdlDeadWp: W20 (pid=5333) terminated automatically

Sat Sep 21 09:05:40:384 2019


DpWpCheck: dyn W17, pid 7179 no longer needed, terminate now

Sat Sep 21 09:05:41:302 2019


DpHdlDeadWp: W17 (pid=7179) terminated automatically

Sat Sep 21 09:09:23:653 2019


DpWpDynCreate: created new work process W18-5166

Sat Sep 21 09:11:06:459 2019


DpWpDynCreate: created new work process W19-5674

Sat Sep 21 09:14:28:515 2019


DpHdlDeadWp: W18 (pid=5166) terminated automatically

Sat Sep 21 09:16:20:402 2019


DpWpCheck: dyn W19, pid 5674 no longer needed, terminate now

Sat Sep 21 09:16:20:892 2019


DpHdlDeadWp: W19 (pid=5674) terminated automatically
Sat Sep 21 09:16:37:329 2019
DpWpDynCreate: created new work process W16-7926

Sat Sep 21 09:21:40:410 2019


DpWpCheck: dyn W16, pid 7926 no longer needed, terminate now

Sat Sep 21 09:21:41:290 2019


DpHdlDeadWp: W16 (pid=7926) terminated automatically

Sat Sep 21 09:22:12:797 2019


DpWpDynCreate: created new work process W20-9550

Sat Sep 21 09:27:16:707 2019


DpHdlDeadWp: W20 (pid=9550) terminated automatically

Sat Sep 21 09:27:34:133 2019


DpWpDynCreate: created new work process W17-11494

Sat Sep 21 09:32:40:430 2019


DpWpCheck: dyn W17, pid 11494 no longer needed, terminate now

Sat Sep 21 09:32:41:043 2019


DpHdlDeadWp: W17 (pid=11494) terminated automatically

Sat Sep 21 09:33:07:473 2019


DpWpDynCreate: created new work process W18-13319

Sat Sep 21 09:35:12:927 2019


DpWpDynCreate: created new work process W19-13865

Sat Sep 21 09:38:20:438 2019


DpWpCheck: dyn W18, pid 13319 no longer needed, terminate now

Sat Sep 21 09:38:21:437 2019


DpHdlDeadWp: W18 (pid=13319) terminated automatically

Sat Sep 21 09:39:06:222 2019


DpWpDynCreate: created new work process W16-15026

Sat Sep 21 09:40:13:799 2019


DpHdlDeadWp: W19 (pid=13865) terminated automatically

Sat Sep 21 09:40:28:173 2019


DpWpDynCreate: created new work process W20-15488

Sat Sep 21 09:44:13:141 2019


DpHdlDeadWp: W16 (pid=15026) terminated automatically

Sat Sep 21 09:45:29:450 2019


DpWpDynCreate: created new work process W17-17015

Sat Sep 21 09:45:32:471 2019


DpHdlDeadWp: W20 (pid=15488) terminated automatically

Sat Sep 21 09:45:36:727 2019


DpWpDynCreate: created new work process W18-17123

Sat Sep 21 09:48:08:421 2019


DpWpDynCreate: created new work process W19-17948

Sat Sep 21 09:50:38:206 2019


DpWpCheck: dyn W17, pid 17015 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=17123) terminated automatically

Sat Sep 21 09:50:40:232 2019


DpHdlDeadWp: W17 (pid=17015) terminated automatically

Sat Sep 21 09:53:09:516 2019


DpHdlDeadWp: W19 (pid=17948) terminated automatically

Sat Sep 21 09:55:07:132 2019


DpWpDynCreate: created new work process W16-20186

Sat Sep 21 09:55:30:532 2019


DpWpDynCreate: created new work process W20-20270

Sat Sep 21 10:00:09:885 2019


DpHdlDeadWp: W16 (pid=20186) terminated automatically

Sat Sep 21 10:00:31:470 2019


DpHdlDeadWp: W20 (pid=20270) terminated automatically

Sat Sep 21 10:04:12:284 2019


DpWpDynCreate: created new work process W18-4805

Sat Sep 21 10:09:14:579 2019


DpHdlDeadWp: W18 (pid=4805) terminated automatically

Sat Sep 21 10:10:08:836 2019


DpWpDynCreate: created new work process W17-20791

Sat Sep 21 10:10:30:740 2019


DpWpDynCreate: created new work process W19-20886

Sat Sep 21 10:15:12:932 2019


DpHdlDeadWp: W17 (pid=20791) terminated automatically

Sat Sep 21 10:15:31:522 2019


DpHdlDeadWp: W19 (pid=20886) terminated automatically

Sat Sep 21 10:15:34:000 2019


DpWpDynCreate: created new work process W16-22518

Sat Sep 21 10:19:06:049 2019


DpWpDynCreate: created new work process W20-23679

Sat Sep 21 10:20:40:512 2019


DpWpCheck: dyn W16, pid 22518 no longer needed, terminate now

Sat Sep 21 10:20:41:169 2019


DpHdlDeadWp: W16 (pid=22518) terminated automatically

Sat Sep 21 10:24:20:518 2019


DpWpCheck: dyn W20, pid 23679 no longer needed, terminate now

Sat Sep 21 10:24:21:451 2019


DpHdlDeadWp: W20 (pid=23679) terminated automatically
Sat Sep 21 10:25:06:271 2019
DpWpDynCreate: created new work process W18-26032

Sat Sep 21 10:30:07:762 2019


DpHdlDeadWp: W18 (pid=26032) terminated automatically

Sat Sep 21 10:30:26:559 2019


DpWpDynCreate: created new work process W17-27653

Sat Sep 21 10:35:40:236 2019


DpHdlDeadWp: W17 (pid=27653) terminated automatically

Sat Sep 21 10:39:11:067 2019


DpWpDynCreate: created new work process W19-30585

Sat Sep 21 10:44:13:150 2019


DpHdlDeadWp: W19 (pid=30585) terminated automatically

Sat Sep 21 10:45:17:294 2019


DpWpDynCreate: created new work process W16-32444

Sat Sep 21 10:45:17:458 2019


DpWpDynCreate: created new work process W20-32445

Sat Sep 21 10:47:36:178 2019


DpWpDynCreate: created new work process W18-973

Sat Sep 21 10:50:19:520 2019


DpHdlDeadWp: W16 (pid=32444) terminated automatically
DpWpCheck: dyn W20, pid 32445 no longer needed, terminate now

Sat Sep 21 10:50:20:050 2019


DpHdlDeadWp: W20 (pid=32445) terminated automatically

Sat Sep 21 10:50:21:541 2019


DpWpDynCreate: created new work process W17-1627

Sat Sep 21 10:50:25:847 2019


DpWpDynCreate: created new work process W19-1637

Sat Sep 21 10:52:40:570 2019


DpWpCheck: dyn W18, pid 973 no longer needed, terminate now

Sat Sep 21 10:52:40:799 2019


DpHdlDeadWp: W18 (pid=973) terminated automatically

Sat Sep 21 10:54:06:469 2019


DpWpDynCreate: created new work process W16-3047

Sat Sep 21 10:55:25:054 2019


DpHdlDeadWp: W17 (pid=1627) terminated automatically

Sat Sep 21 10:55:29:344 2019


DpHdlDeadWp: W19 (pid=1637) terminated automatically

Sat Sep 21 10:55:33:696 2019


DpWpDynCreate: created new work process W20-3598
Sat Sep 21 10:59:08:587 2019
DpHdlDeadWp: W16 (pid=3047) terminated automatically

Sat Sep 21 11:00:05:995 2019


DpWpDynCreate: created new work process W18-5144

Sat Sep 21 11:00:40:585 2019


DpWpCheck: dyn W20, pid 3598 no longer needed, terminate now

Sat Sep 21 11:00:41:671 2019


DpHdlDeadWp: W20 (pid=3598) terminated automatically

Sat Sep 21 11:04:06:042 2019


DpWpDynCreate: created new work process W17-17720

Sat Sep 21 11:05:06:768 2019


DpHdlDeadWp: W18 (pid=5144) terminated automatically

Sat Sep 21 11:05:07:887 2019


DpWpDynCreate: created new work process W19-20112

Sat Sep 21 11:09:07:369 2019


DpHdlDeadWp: W17 (pid=17720) terminated automatically

Sat Sep 21 11:09:11:238 2019


DpWpDynCreate: created new work process W16-3432

Sat Sep 21 11:10:08:391 2019


DpWpDynCreate: created new work process W20-3734

Sat Sep 21 11:10:09:686 2019


DpHdlDeadWp: W19 (pid=20112) terminated automatically

Sat Sep 21 11:12:03:979 2019


DpWpDynCreate: created new work process W18-4416

Sat Sep 21 11:12:16:406 2019


DpWpDynCreate: created new work process W17-4478

Sat Sep 21 11:14:12:741 2019


DpHdlDeadWp: W16 (pid=3432) terminated automatically

Sat Sep 21 11:15:20:607 2019


DpWpCheck: dyn W20, pid 3734 no longer needed, terminate now

Sat Sep 21 11:15:21:478 2019


DpHdlDeadWp: W20 (pid=3734) terminated automatically

Sat Sep 21 11:16:27:715 2019


DpWpDynCreate: created new work process W19-6114

Sat Sep 21 11:17:20:610 2019


DpWpCheck: dyn W17, pid 4478 no longer needed, terminate now
DpWpCheck: dyn W18, pid 4416 no longer needed, terminate now

Sat Sep 21 11:17:21:648 2019


DpHdlDeadWp: W17 (pid=4478) terminated automatically
DpHdlDeadWp: W18 (pid=4416) terminated automatically
Sat Sep 21 11:19:06:151 2019
DpWpDynCreate: created new work process W16-7381

Sat Sep 21 11:21:06:590 2019


DpWpDynCreate: created new work process W20-8061

Sat Sep 21 11:21:28:304 2019


DpHdlDeadWp: W19 (pid=6114) terminated automatically

Sat Sep 21 11:24:12:229 2019


DpHdlDeadWp: W16 (pid=7381) terminated automatically

Sat Sep 21 11:26:09:329 2019


DpHdlDeadWp: W20 (pid=8061) terminated automatically

Sat Sep 21 11:26:09:527 2019


DpWpDynCreate: created new work process W17-9791

Sat Sep 21 11:26:14:060 2019


DpWpDynCreate: created new work process W18-9795

Sat Sep 21 11:26:28:637 2019


DpWpDynCreate: created new work process W19-9804

Sat Sep 21 11:31:10:615 2019


DpHdlDeadWp: W17 (pid=9791) terminated automatically

Sat Sep 21 11:31:25:469 2019


DpHdlDeadWp: W18 (pid=9795) terminated automatically

Sat Sep 21 11:31:30:720 2019


DpHdlDeadWp: W19 (pid=9804) terminated automatically

Sat Sep 21 11:31:50:548 2019


DpWpDynCreate: created new work process W16-11632

Sat Sep 21 11:31:54:300 2019


DpWpDynCreate: created new work process W20-11641

Sat Sep 21 11:36:25:402 2019


DpWpDynCreate: created new work process W17-12945

Sat Sep 21 11:37:00:639 2019


DpWpCheck: dyn W16, pid 11632 no longer needed, terminate now
DpWpCheck: dyn W20, pid 11641 no longer needed, terminate now

Sat Sep 21 11:37:01:049 2019


DpHdlDeadWp: W16 (pid=11632) terminated automatically
DpHdlDeadWp: W20 (pid=11641) terminated automatically

Sat Sep 21 11:40:05:203 2019


DpWpDynCreate: created new work process W18-14038

Sat Sep 21 11:41:08:652 2019


DpWpDynCreate: created new work process W19-14332

Sat Sep 21 11:41:26:744 2019


DpHdlDeadWp: W17 (pid=12945) terminated automatically
Sat Sep 21 11:45:20:658 2019
DpWpCheck: dyn W18, pid 14038 no longer needed, terminate now

Sat Sep 21 11:45:21:541 2019


DpHdlDeadWp: W18 (pid=14038) terminated automatically

Sat Sep 21 11:46:11:068 2019


DpHdlDeadWp: W19 (pid=14332) terminated automatically

Sat Sep 21 11:46:17:492 2019


DpWpDynCreate: created new work process W16-16089

Sat Sep 21 11:46:37:051 2019


DpWpDynCreate: created new work process W20-16181

Sat Sep 21 11:51:18:311 2019


DpHdlDeadWp: W16 (pid=16089) terminated automatically

Sat Sep 21 11:51:26:200 2019


DpWpDynCreate: created new work process W17-17561

Sat Sep 21 11:51:40:668 2019


DpWpCheck: dyn W20, pid 16181 no longer needed, terminate now

Sat Sep 21 11:51:41:424 2019


DpHdlDeadWp: W20 (pid=16181) terminated automatically

Sat Sep 21 11:56:14:971 2019


DpWpDynCreate: created new work process W18-19229

Sat Sep 21 11:56:15:248 2019


DpWpDynCreate: created new work process W19-19230

Sat Sep 21 11:56:40:678 2019


DpWpCheck: dyn W17, pid 17561 no longer needed, terminate now

Sat Sep 21 11:56:41:830 2019


DpHdlDeadWp: W17 (pid=17561) terminated automatically

Sat Sep 21 12:01:15:881 2019


DpHdlDeadWp: W18 (pid=19229) terminated automatically

Sat Sep 21 12:01:19:342 2019


DpHdlDeadWp: W19 (pid=19230) terminated automatically

Sat Sep 21 12:01:33:346 2019


DpWpDynCreate: created new work process W16-21768

Sat Sep 21 12:06:28:870 2019


DpWpDynCreate: created new work process W20-7323

Sat Sep 21 12:06:29:037 2019


DpWpDynCreate: created new work process W17-7325

Sat Sep 21 12:06:34:732 2019


DpHdlDeadWp: W16 (pid=21768) terminated automatically

Sat Sep 21 12:11:18:320 2019


DpWpDynCreate: created new work process W18-19503
Sat Sep 21 12:11:40:708 2019
DpWpCheck: dyn W17, pid 7325 no longer needed, terminate now
DpWpCheck: dyn W20, pid 7323 no longer needed, terminate now

Sat Sep 21 12:11:41:228 2019


DpHdlDeadWp: W17 (pid=7325) terminated automatically
DpHdlDeadWp: W20 (pid=7323) terminated automatically

Sat Sep 21 12:15:11:201 2019


DpWpDynCreate: created new work process W19-21169

Sat Sep 21 12:16:19:538 2019


DpHdlDeadWp: W18 (pid=19503) terminated automatically

Sat Sep 21 12:16:22:832 2019


DpWpDynCreate: created new work process W16-21473

Sat Sep 21 12:16:24:757 2019


DpWpDynCreate: created new work process W17-21476

Sat Sep 21 12:20:13:390 2019


DpHdlDeadWp: W19 (pid=21169) terminated automatically

Sat Sep 21 12:21:23:279 2019


DpWpDynCreate: created new work process W20-23282

Sat Sep 21 12:21:23:499 2019


DpHdlDeadWp: W16 (pid=21473) terminated automatically
DpWpDynCreate: created new work process W18-23283

Sat Sep 21 12:21:25:505 2019


DpHdlDeadWp: W17 (pid=21476) terminated automatically

Sat Sep 21 12:25:06:094 2019


DpWpDynCreate: created new work process W19-24751

Sat Sep 21 12:26:26:498 2019


DpWpCheck: dyn W18, pid 23283 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=23282) terminated automatically
DpHdlDeadWp: W18 (pid=23283) terminated automatically

Sat Sep 21 12:27:05:030 2019


DpWpDynCreate: created new work process W16-25642

Sat Sep 21 12:30:07:475 2019


DpHdlDeadWp: W19 (pid=24751) terminated automatically

Sat Sep 21 12:30:24:621 2019


DpWpDynCreate: created new work process W17-26673

Sat Sep 21 12:32:07:515 2019


DpHdlDeadWp: W16 (pid=25642) terminated automatically
DpWpDynCreate: created new work process W16-27290

Sat Sep 21 12:32:07:649 2019


DpWpDynCreate: created new work process W20-27291

Sat Sep 21 12:35:27:722 2019


DpHdlDeadWp: W17 (pid=26673) terminated automatically

Sat Sep 21 12:37:08:408 2019


DpHdlDeadWp: W20 (pid=27291) terminated automatically

Sat Sep 21 12:37:08:548 2019


DpHdlDeadWp: W16 (pid=27290) terminated automatically

Sat Sep 21 12:37:09:585 2019


DpWpDynCreate: created new work process W18-28876

Sat Sep 21 12:37:17:230 2019


DpWpDynCreate: created new work process W19-28883

Sat Sep 21 12:42:12:222 2019


DpHdlDeadWp: W18 (pid=28876) terminated automatically

Sat Sep 21 12:42:20:763 2019


DpWpCheck: dyn W19, pid 28883 no longer needed, terminate now

Sat Sep 21 12:42:21:294 2019


DpHdlDeadWp: W19 (pid=28883) terminated automatically

Sat Sep 21 12:42:23:065 2019


DpWpDynCreate: created new work process W17-30624

Sat Sep 21 12:47:29:637 2019


DpHdlDeadWp: W17 (pid=30624) terminated automatically

Sat Sep 21 12:51:09:934 2019


DpWpDynCreate: created new work process W20-1105

Sat Sep 21 12:51:16:029 2019


DpWpDynCreate: created new work process W16-1145

Sat Sep 21 12:52:24:673 2019


DpWpDynCreate: created new work process W18-1449

Sat Sep 21 12:56:11:165 2019


DpHdlDeadWp: W20 (pid=1105) terminated automatically

Sat Sep 21 12:56:20:786 2019


DpWpCheck: dyn W16, pid 1145 no longer needed, terminate now

Sat Sep 21 12:56:21:261 2019


DpHdlDeadWp: W16 (pid=1145) terminated automatically

Sat Sep 21 12:57:14:740 2019


DpWpDynCreate: created new work process W19-3110

Sat Sep 21 12:57:15:883 2019


DpWpDynCreate: created new work process W17-3113

Sat Sep 21 12:57:25:236 2019


DpHdlDeadWp: W18 (pid=1449) terminated automatically

Sat Sep 21 13:02:17:828 2019


DpHdlDeadWp: W17 (pid=3113) terminated automatically
DpHdlDeadWp: W19 (pid=3110) terminated automatically
Sat Sep 21 13:06:38:119 2019
DpWpDynCreate: created new work process W20-23066

Sat Sep 21 13:11:40:825 2019


DpWpCheck: dyn W20, pid 23066 no longer needed, terminate now

Sat Sep 21 13:11:41:483 2019


DpHdlDeadWp: W20 (pid=23066) terminated automatically

Sat Sep 21 13:16:28:737 2019


DpWpDynCreate: created new work process W16-4830

Sat Sep 21 13:18:34:116 2019


DpWpDynCreate: created new work process W18-5624

Sat Sep 21 13:18:52:790 2019


DpWpDynCreate: created new work process W17-5803

Sat Sep 21 13:21:40:843 2019


DpWpCheck: dyn W16, pid 4830 no longer needed, terminate now

Sat Sep 21 13:21:42:153 2019


DpHdlDeadWp: W16 (pid=4830) terminated automatically

Sat Sep 21 13:23:40:847 2019


DpWpCheck: dyn W18, pid 5624 no longer needed, terminate now

Sat Sep 21 13:23:41:458 2019


DpHdlDeadWp: W18 (pid=5624) terminated automatically

Sat Sep 21 13:24:00:847 2019


DpWpCheck: dyn W17, pid 5803 no longer needed, terminate now

Sat Sep 21 13:24:01:557 2019


DpHdlDeadWp: W17 (pid=5803) terminated automatically

Sat Sep 21 13:24:15:645 2019


DpWpDynCreate: created new work process W19-7462

Sat Sep 21 13:29:16:421 2019


DpHdlDeadWp: W19 (pid=7462) terminated automatically

Sat Sep 21 13:29:25:988 2019


DpWpDynCreate: created new work process W20-9235

Sat Sep 21 13:30:11:463 2019


DpWpDynCreate: created new work process W16-9542

Sat Sep 21 13:34:27:863 2019


DpHdlDeadWp: W20 (pid=9235) terminated automatically

Sat Sep 21 13:34:31:996 2019


DpWpDynCreate: created new work process W18-10836

Sat Sep 21 13:35:12:683 2019


DpHdlDeadWp: W16 (pid=9542) terminated automatically

Sat Sep 21 13:37:07:848 2019


DpWpDynCreate: created new work process W17-11787

Sat Sep 21 13:39:35:553 2019


DpHdlDeadWp: W18 (pid=10836) terminated automatically

Sat Sep 21 13:41:08:686 2019


DpWpDynCreate: created new work process W19-13014

Sat Sep 21 13:42:11:280 2019


DpHdlDeadWp: W17 (pid=11787) terminated automatically

Sat Sep 21 13:44:26:192 2019


DpWpDynCreate: created new work process W20-13895

Sat Sep 21 13:46:12:029 2019


DpHdlDeadWp: W19 (pid=13014) terminated automatically

Sat Sep 21 13:47:17:242 2019


DpWpDynCreate: created new work process W16-14910

Sat Sep 21 13:49:27:561 2019


DpHdlDeadWp: W20 (pid=13895) terminated automatically

Sat Sep 21 13:52:09:417 2019


DpWpDynCreate: created new work process W18-16407

Sat Sep 21 13:52:13:824 2019


DpWpDynCreate: created new work process W17-16414

Sat Sep 21 13:52:20:891 2019


DpWpCheck: dyn W16, pid 14910 no longer needed, terminate now

Sat Sep 21 13:52:21:462 2019


DpHdlDeadWp: W16 (pid=14910) terminated automatically

Sat Sep 21 13:57:35:290 2019


DpHdlDeadWp: W17 (pid=16414) terminated automatically
DpHdlDeadWp: W18 (pid=16407) terminated automatically

Sat Sep 21 13:57:36:100 2019


DpSendLoadInfo: quota for load / queue fill level = 7.200000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 7.896410 / 0.035714

Sat Sep 21 13:57:36:836 2019


DpHdlDeadWp: W13 (pid=17305) terminated automatically
DpWpDynCreate: created new work process W13-18331

Sat Sep 21 13:57:39:102 2019


DpSendLoadInfo: queue DIA no longer with high load

Sat Sep 21 13:59:05:739 2019


DpWpDynCreate: created new work process W19-19017

Sat Sep 21 14:00:27:401 2019


DpWpDynCreate: created new work process W20-19727

Sat Sep 21 14:01:34:983 2019


DpWpDynCreate: created new work process W16-20827
Sat Sep 21 14:04:07:658 2019
DpHdlDeadWp: W19 (pid=19017) terminated automatically

Sat Sep 21 14:05:28:521 2019


DpHdlDeadWp: W20 (pid=19727) terminated automatically

Sat Sep 21 14:05:37:813 2019


DpWpDynCreate: created new work process W17-30564

Sat Sep 21 14:06:41:407 2019


DpWpCheck: dyn W16, pid 20827 no longer needed, terminate now

Sat Sep 21 14:06:41:787 2019


DpHdlDeadWp: W16 (pid=20827) terminated automatically

Sat Sep 21 14:10:41:412 2019


DpWpCheck: dyn W17, pid 30564 no longer needed, terminate now

Sat Sep 21 14:10:42:235 2019


DpHdlDeadWp: W17 (pid=30564) terminated automatically

Sat Sep 21 14:16:36:394 2019


DpWpDynCreate: created new work process W18-20887

Sat Sep 21 14:21:38:236 2019


DpHdlDeadWp: W18 (pid=20887) terminated automatically

Sat Sep 21 14:25:06:310 2019


DpWpDynCreate: created new work process W19-23854

Sat Sep 21 14:30:08:665 2019


DpHdlDeadWp: W19 (pid=23854) terminated automatically

Sat Sep 21 14:30:19:210 2019


DpWpDynCreate: created new work process W20-25702

Sat Sep 21 14:35:06:039 2019


DpWpDynCreate: created new work process W16-27327

Sat Sep 21 14:35:21:796 2019


DpHdlDeadWp: W20 (pid=25702) terminated automatically

Sat Sep 21 14:37:40:168 2019


DpWpDynCreate: created new work process W17-28131

Sat Sep 21 14:40:21:804 2019


DpWpCheck: dyn W16, pid 27327 no longer needed, terminate now

Sat Sep 21 14:40:22:431 2019


DpHdlDeadWp: W16 (pid=27327) terminated automatically

Sat Sep 21 14:41:12:547 2019


DpWpDynCreate: created new work process W18-29146

Sat Sep 21 14:42:41:809 2019


DpWpCheck: dyn W17, pid 28131 no longer needed, terminate now

Sat Sep 21 14:42:42:602 2019


DpHdlDeadWp: W17 (pid=28131) terminated automatically

Sat Sep 21 14:44:04:819 2019


DpWpDynCreate: created new work process W19-30124

Sat Sep 21 14:46:21:817 2019


DpWpCheck: dyn W18, pid 29146 no longer needed, terminate now

Sat Sep 21 14:46:22:647 2019


DpHdlDeadWp: W18 (pid=29146) terminated automatically

Sat Sep 21 14:47:11:608 2019


DpWpDynCreate: created new work process W20-31166

Sat Sep 21 14:47:39:138 2019


DpWpDynCreate: created new work process W16-31267

Sat Sep 21 14:49:21:821 2019


DpWpCheck: dyn W19, pid 30124 no longer needed, terminate now

Sat Sep 21 14:49:22:494 2019


DpHdlDeadWp: W19 (pid=30124) terminated automatically

Sat Sep 21 14:49:40:188 2019


DpWpDynCreate: created new work process W17-31921

Sat Sep 21 14:52:12:707 2019


DpHdlDeadWp: W20 (pid=31166) terminated automatically

Sat Sep 21 14:52:27:194 2019


DpWpDynCreate: created new work process W18-363

Sat Sep 21 14:52:29:272 2019


DpWpDynCreate: created new work process W19-462

Sat Sep 21 14:52:41:826 2019


DpWpCheck: dyn W16, pid 31267 no longer needed, terminate now

Sat Sep 21 14:52:42:174 2019


DpHdlDeadWp: W16 (pid=31267) terminated automatically

Sat Sep 21 14:54:41:259 2019


DpHdlDeadWp: W17 (pid=31921) terminated automatically

Sat Sep 21 14:56:07:589 2019


DpWpDynCreate: created new work process W20-1666

Sat Sep 21 14:57:29:756 2019


DpHdlDeadWp: W18 (pid=363) terminated automatically

Sat Sep 21 14:57:32:765 2019


DpHdlDeadWp: W19 (pid=462) terminated automatically

Sat Sep 21 14:59:10:235 2019


DpWpDynCreate: created new work process W16-2790

Sat Sep 21 15:01:09:715 2019


DpHdlDeadWp: W20 (pid=1666) terminated automatically
Sat Sep 21 15:02:22:145 2019
DpWpDynCreate: created new work process W17-7353

Sat Sep 21 15:04:21:846 2019


DpWpCheck: dyn W16, pid 2790 no longer needed, terminate now

Sat Sep 21 15:04:22:839 2019


DpHdlDeadWp: W16 (pid=2790) terminated automatically

Sat Sep 21 15:06:05:987 2019


DpWpDynCreate: created new work process W18-21372

Sat Sep 21 15:07:28:877 2019


DpHdlDeadWp: W17 (pid=7353) terminated automatically

Sat Sep 21 15:10:11:248 2019


DpWpDynCreate: created new work process W19-1641

Sat Sep 21 15:11:06:193 2019


DpHdlDeadWp: W18 (pid=21372) terminated automatically

Sat Sep 21 15:12:13:887 2019


DpWpDynCreate: created new work process W20-2601

Sat Sep 21 15:14:34:649 2019


DpWpDynCreate: created new work process W16-3502

Sat Sep 21 15:15:21:864 2019


DpWpCheck: dyn W19, pid 1641 no longer needed, terminate now

Sat Sep 21 15:15:22:422 2019


DpHdlDeadWp: W19 (pid=1641) terminated automatically

Sat Sep 21 15:17:17:068 2019


DpHdlDeadWp: W20 (pid=2601) terminated automatically

Sat Sep 21 15:17:24:510 2019


DpWpDynCreate: created new work process W17-4243

Sat Sep 21 15:19:41:871 2019


DpWpCheck: dyn W16, pid 3502 no longer needed, terminate now

Sat Sep 21 15:19:42:655 2019


DpHdlDeadWp: W16 (pid=3502) terminated automatically

Sat Sep 21 15:22:25:857 2019


DpHdlDeadWp: W17 (pid=4243) terminated automatically

Sat Sep 21 15:22:28:332 2019


DpWpDynCreate: created new work process W18-6103

Sat Sep 21 15:22:28:763 2019


DpWpDynCreate: created new work process W19-6104

Sat Sep 21 15:22:28:872 2019


DpWpDynCreate: created new work process W20-6105

Sat Sep 21 15:27:41:884 2019


DpWpCheck: dyn W18, pid 6103 no longer needed, terminate now
DpWpCheck: dyn W19, pid 6104 no longer needed, terminate now
DpWpCheck: dyn W20, pid 6105 no longer needed, terminate now

Sat Sep 21 15:27:42:203 2019


DpHdlDeadWp: W19 (pid=6104) terminated automatically
DpHdlDeadWp: W20 (pid=6105) terminated automatically

Sat Sep 21 15:27:43:096 2019


DpHdlDeadWp: W18 (pid=6103) terminated automatically

Sat Sep 21 15:29:34:539 2019


DpWpDynCreate: created new work process W16-8407

Sat Sep 21 15:31:04:132 2019


DpWpDynCreate: created new work process W17-8909

Sat Sep 21 15:31:15:224 2019


DpWpDynCreate: created new work process W19-9117

Sat Sep 21 15:34:41:896 2019


DpWpCheck: dyn W16, pid 8407 no longer needed, terminate now

Sat Sep 21 15:34:42:625 2019


DpHdlDeadWp: W16 (pid=8407) terminated automatically

Sat Sep 21 15:36:05:914 2019


DpHdlDeadWp: W17 (pid=8909) terminated automatically

Sat Sep 21 15:36:16:982 2019


DpHdlDeadWp: W19 (pid=9117) terminated automatically

Sat Sep 21 15:36:17:003 2019


DpWpDynCreate: created new work process W20-10548

Sat Sep 21 15:41:21:906 2019


DpWpCheck: dyn W20, pid 10548 no longer needed, terminate now

Sat Sep 21 15:41:22:317 2019


DpHdlDeadWp: W20 (pid=10548) terminated automatically

Sat Sep 21 15:42:11:896 2019


DpWpDynCreate: created new work process W18-12567

Sat Sep 21 15:43:07:782 2019


DpWpDynCreate: created new work process W16-12801

Sat Sep 21 15:43:10:022 2019


DpWpDynCreate: created new work process W17-12806

Sat Sep 21 15:47:19:345 2019


DpHdlDeadWp: W18 (pid=12567) terminated automatically

Sat Sep 21 15:48:21:919 2019


DpWpCheck: dyn W16, pid 12801 no longer needed, terminate now
DpWpCheck: dyn W17, pid 12806 no longer needed, terminate now

Sat Sep 21 15:48:22:490 2019


DpHdlDeadWp: W16 (pid=12801) terminated automatically

Sat Sep 21 15:48:23:512 2019


DpHdlDeadWp: W17 (pid=12806) terminated automatically

Sat Sep 21 15:49:16:756 2019


DpWpDynCreate: created new work process W19-14634

Sat Sep 21 15:52:11:859 2019


DpWpDynCreate: created new work process W20-15665

Sat Sep 21 15:52:12:039 2019


DpWpDynCreate: created new work process W18-15666

Sat Sep 21 15:52:12:261 2019


DpWpDynCreate: created new work process W16-15667

Sat Sep 21 15:54:18:728 2019


DpHdlDeadWp: W19 (pid=14634) terminated automatically

Sat Sep 21 15:57:20:041 2019


DpWpCheck: dyn W16, pid 15667 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=15666) terminated automatically
DpWpCheck: dyn W20, pid 15665 no longer needed, terminate now

Sat Sep 21 15:57:21:132 2019


DpHdlDeadWp: W16 (pid=15667) terminated automatically
DpHdlDeadWp: W20 (pid=15665) terminated automatically

Sat Sep 21 15:59:30:111 2019


DpWpDynCreate: created new work process W17-17755

Sat Sep 21 16:04:33:520 2019


DpHdlDeadWp: W17 (pid=17755) terminated automatically

Sat Sep 21 16:07:15:508 2019


DpWpDynCreate: created new work process W19-8667

Sat Sep 21 16:09:19:163 2019


DpWpDynCreate: created new work process W18-16380

Sat Sep 21 16:12:21:961 2019


DpWpCheck: dyn W19, pid 8667 no longer needed, terminate now

Sat Sep 21 16:12:23:011 2019


DpHdlDeadWp: W19 (pid=8667) terminated automatically

Sat Sep 21 16:14:21:138 2019


DpHdlDeadWp: W18 (pid=16380) terminated automatically

Sat Sep 21 16:16:08:865 2019


DpWpDynCreate: created new work process W16-18576

Sat Sep 21 16:21:21:976 2019


DpWpCheck: dyn W16, pid 18576 no longer needed, terminate now

Sat Sep 21 16:21:22:556 2019


DpHdlDeadWp: W16 (pid=18576) terminated automatically

Sat Sep 21 16:22:17:775 2019


DpWpDynCreate: created new work process W20-21003

Sat Sep 21 16:25:26:366 2019


DpWpDynCreate: created new work process W17-21928

Sat Sep 21 16:27:21:988 2019


DpWpCheck: dyn W20, pid 21003 no longer needed, terminate now

Sat Sep 21 16:27:22:960 2019


DpHdlDeadWp: W20 (pid=21003) terminated automatically

Sat Sep 21 16:30:32:063 2019


DpHdlDeadWp: W17 (pid=21928) terminated automatically

Sat Sep 21 16:34:06:590 2019


DpWpDynCreate: created new work process W19-25051

Sat Sep 21 16:36:11:632 2019


DpWpDynCreate: created new work process W18-25744

Sat Sep 21 16:36:11:969 2019


DpWpDynCreate: created new work process W16-25751

Sat Sep 21 16:39:11:934 2019


DpHdlDeadWp: W19 (pid=25051) terminated automatically

Sat Sep 21 16:41:13:937 2019


DpHdlDeadWp: W16 (pid=25751) terminated automatically

Sat Sep 21 16:41:22:610 2019


DpHdlDeadWp: W18 (pid=25744) terminated automatically

Sat Sep 21 16:41:24:928 2019


DpWpDynCreate: created new work process W20-27429

Sat Sep 21 16:41:37:966 2019


DpWpDynCreate: created new work process W17-27512

Sat Sep 21 16:44:08:618 2019


DpWpDynCreate: created new work process W19-28238

Sat Sep 21 16:46:30:371 2019


DpHdlDeadWp: W20 (pid=27429) terminated automatically

Sat Sep 21 16:46:42:021 2019


DpWpCheck: dyn W17, pid 27512 no longer needed, terminate now

Sat Sep 21 16:46:42:481 2019


DpHdlDeadWp: W17 (pid=27512) terminated automatically

Sat Sep 21 16:47:06:852 2019


DpWpDynCreate: created new work process W16-29172

Sat Sep 21 16:48:08:949 2019


DpWpDynCreate: created new work process W18-29674

Sat Sep 21 16:49:22:025 2019


DpWpCheck: dyn W19, pid 28238 no longer needed, terminate now

Sat Sep 21 16:49:22:618 2019


DpHdlDeadWp: W19 (pid=28238) terminated automatically

Sat Sep 21 16:52:22:031 2019


DpWpCheck: dyn W16, pid 29172 no longer needed, terminate now

Sat Sep 21 16:52:22:779 2019


DpHdlDeadWp: W16 (pid=29172) terminated automatically

Sat Sep 21 16:53:22:032 2019


DpWpCheck: dyn W18, pid 29674 no longer needed, terminate now

Sat Sep 21 16:53:22:920 2019


DpHdlDeadWp: W18 (pid=29674) terminated automatically

Sat Sep 21 16:57:17:145 2019


DpWpDynCreate: created new work process W20-342

Sat Sep 21 16:59:10:927 2019


DpWpDynCreate: created new work process W17-839

Sat Sep 21 17:02:20:288 2019


DpHdlDeadWp: W20 (pid=342) terminated automatically

Sat Sep 21 17:02:34:744 2019


DpWpDynCreate: created new work process W19-7238

Sat Sep 21 17:04:12:461 2019


DpHdlDeadWp: W17 (pid=839) terminated automatically

Sat Sep 21 17:07:29:915 2019


DpWpDynCreate: created new work process W16-26425

Sat Sep 21 17:07:35:761 2019


DpHdlDeadWp: W19 (pid=7238) terminated automatically

Sat Sep 21 17:09:04:826 2019


DpWpDynCreate: created new work process W18-31784

Sat Sep 21 17:12:32:980 2019


DpHdlDeadWp: W16 (pid=26425) terminated automatically

Sat Sep 21 17:14:21:162 2019


DpHdlDeadWp: W18 (pid=31784) terminated automatically

Sat Sep 21 17:15:43:054 2019


DpWpDynCreate: created new work process W20-1749

Sat Sep 21 17:21:02:467 2019


DpHdlDeadWp: W11 (pid=10570) terminated automatically
DpWpDynCreate: created new work process W11-3597
DpWpCheck: dyn W20, pid 1749 no longer needed, terminate now

Sat Sep 21 17:21:02:673 2019


DpWpDynCreate: created new work process W17-3598
DpHdlDeadWp: W20 (pid=1749) terminated automatically
DpWpDynCreate: created new work process W19-3599

Sat Sep 21 17:21:03:424 2019


DpWpDynCreate: created new work process W16-3604

Sat Sep 21 17:22:49:663 2019


DpHdlDeadWp: W11 (pid=3597) terminated automatically
DpWpDynCreate: created new work process W11-4312

Sat Sep 21 17:26:03:437 2019


DpHdlDeadWp: W19 (pid=3599) terminated automatically

Sat Sep 21 17:26:03:955 2019


DpHdlDeadWp: W17 (pid=3598) terminated automatically

Sat Sep 21 17:26:05:047 2019


DpHdlDeadWp: W16 (pid=3604) terminated automatically

Sat Sep 21 17:26:22:067 2019


DpWpDynCreate: created new work process W18-5553

Sat Sep 21 17:31:42:483 2019


DpWpCheck: dyn W18, pid 5553 no longer needed, terminate now

Sat Sep 21 17:31:43:408 2019


DpHdlDeadWp: W18 (pid=5553) terminated automatically

Sat Sep 21 17:32:07:432 2019


DpWpDynCreate: created new work process W20-7377

Sat Sep 21 17:32:40:760 2019


DpWpDynCreate: created new work process W19-7803

Sat Sep 21 17:37:11:844 2019


DpHdlDeadWp: W20 (pid=7377) terminated automatically

Sat Sep 21 17:37:25:139 2019


DpWpDynCreate: created new work process W17-9273

Sat Sep 21 17:37:47:735 2019


DpHdlDeadWp: W19 (pid=7803) terminated automatically

Sat Sep 21 17:42:42:501 2019


DpWpCheck: dyn W17, pid 9273 no longer needed, terminate now

Sat Sep 21 17:42:43:114 2019


DpHdlDeadWp: W17 (pid=9273) terminated automatically

Sat Sep 21 17:47:14:337 2019


DpWpDynCreate: created new work process W16-12354

Sat Sep 21 17:52:20:234 2019


DpHdlDeadWp: W16 (pid=12354) terminated automatically

Sat Sep 21 17:52:27:048 2019


DpWpDynCreate: created new work process W18-13859
DpWpDynCreate: created new work process W20-13860

Sat Sep 21 17:57:29:647 2019


DpWpCheck: dyn W18, pid 13859 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=13860) terminated automatically
DpHdlDeadWp: W18 (pid=13859) terminated automatically

Sat Sep 21 18:02:06:208 2019


DpWpDynCreate: created new work process W19-19756

Sat Sep 21 18:02:11:768 2019


DpWpDynCreate: created new work process W17-20052

Sat Sep 21 18:07:12:475 2019


DpHdlDeadWp: W17 (pid=20052) terminated automatically
DpWpCheck: dyn W19, pid 19756 no longer needed, terminate now

Sat Sep 21 18:07:13:234 2019


DpHdlDeadWp: W19 (pid=19756) terminated automatically

Sat Sep 21 18:07:50:757 2019


DpWpDynCreate: created new work process W16-11119

Sat Sep 21 18:12:12:011 2019


DpWpDynCreate: created new work process W20-15738

Sat Sep 21 18:13:02:554 2019


DpWpCheck: dyn W16, pid 11119 no longer needed, terminate now

Sat Sep 21 18:13:03:379 2019


DpHdlDeadWp: W16 (pid=11119) terminated automatically

Sat Sep 21 18:15:04:474 2019


DpWpDynCreate: created new work process W18-16523

Sat Sep 21 18:17:14:154 2019


DpHdlDeadWp: W20 (pid=15738) terminated automatically

Sat Sep 21 18:20:05:852 2019


DpHdlDeadWp: W18 (pid=16523) terminated automatically

Sat Sep 21 18:20:06:978 2019


DpWpDynCreate: created new work process W17-18104

Sat Sep 21 18:25:22:574 2019


DpWpCheck: dyn W17, pid 18104 no longer needed, terminate now

Sat Sep 21 18:25:23:184 2019


DpHdlDeadWp: W17 (pid=18104) terminated automatically

Sat Sep 21 18:27:23:487 2019


DpWpDynCreate: created new work process W19-20842

Sat Sep 21 18:27:27:846 2019


DpWpDynCreate: created new work process W16-20869

Sat Sep 21 18:32:28:639 2019


DpHdlDeadWp: W16 (pid=20869) terminated automatically
DpWpCheck: dyn W19, pid 20842 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=20842) terminated automatically

Sat Sep 21 18:32:30:890 2019


DpWpDynCreate: created new work process W20-22795

Sat Sep 21 18:33:09:012 2019


DpWpDynCreate: created new work process W18-22958

Sat Sep 21 18:37:32:312 2019


DpHdlDeadWp: W20 (pid=22795) terminated automatically

Sat Sep 21 18:38:04:193 2019


DpWpDynCreate: created new work process W17-24385

Sat Sep 21 18:38:11:364 2019


DpHdlDeadWp: W18 (pid=22958) terminated automatically

Sat Sep 21 18:38:39:858 2019


DpWpDynCreate: created new work process W16-24990

Sat Sep 21 18:42:09:919 2019


DpWpDynCreate: created new work process W19-26143

Sat Sep 21 18:43:06:552 2019


DpHdlDeadWp: W17 (pid=24385) terminated automatically

Sat Sep 21 18:43:42:612 2019


DpWpCheck: dyn W16, pid 24990 no longer needed, terminate now

Sat Sep 21 18:43:43:713 2019


DpHdlDeadWp: W16 (pid=24990) terminated automatically

Sat Sep 21 18:44:05:031 2019


DpWpDynCreate: created new work process W20-26676

Sat Sep 21 18:47:11:361 2019


DpHdlDeadWp: W19 (pid=26143) terminated automatically

Sat Sep 21 18:47:20:607 2019


DpWpDynCreate: created new work process W18-27629

Sat Sep 21 18:49:09:555 2019


DpHdlDeadWp: W20 (pid=26676) terminated automatically

Sat Sep 21 18:51:27:928 2019


DpWpDynCreate: created new work process W17-29004

Sat Sep 21 18:52:22:624 2019


DpWpCheck: dyn W18, pid 27629 no longer needed, terminate now

Sat Sep 21 18:52:23:180 2019


DpHdlDeadWp: W18 (pid=27629) terminated automatically

Sat Sep 21 18:53:45:539 2019


DpWpDynCreate: created new work process W16-29785

Sat Sep 21 18:56:30:429 2019


DpHdlDeadWp: W17 (pid=29004) terminated automatically

Sat Sep 21 18:58:29:436 2019


DpWpDynCreate: created new work process W19-31135

Sat Sep 21 18:59:02:633 2019


DpWpCheck: dyn W16, pid 29785 no longer needed, terminate now

Sat Sep 21 18:59:03:495 2019


DpHdlDeadWp: W16 (pid=29785) terminated automatically

Sat Sep 21 19:01:24:644 2019


DpWpDynCreate: created new work process W20-32270

Sat Sep 21 19:03:32:788 2019


DpHdlDeadWp: W19 (pid=31135) terminated automatically

Sat Sep 21 19:05:05:009 2019


DpWpDynCreate: created new work process W18-1124

Sat Sep 21 19:06:25:514 2019


DpHdlDeadWp: W20 (pid=32270) terminated automatically

Sat Sep 21 19:08:25:041 2019


DpWpDynCreate: created new work process W17-15844

Sat Sep 21 19:10:22:650 2019


DpWpCheck: dyn W18, pid 1124 no longer needed, terminate now

Sat Sep 21 19:10:23:172 2019


DpHdlDeadWp: W18 (pid=1124) terminated automatically

Sat Sep 21 19:13:09:046 2019


DpWpDynCreate: created new work process W16-29192

Sat Sep 21 19:13:20:739 2019


DpWpDynCreate: created new work process W19-30022

Sat Sep 21 19:13:27:626 2019


DpHdlDeadWp: W17 (pid=15844) terminated automatically

Sat Sep 21 19:15:06:816 2019


DpWpDynCreate: created new work process W20-32000

Sat Sep 21 19:18:22:666 2019


DpWpCheck: dyn W16, pid 29192 no longer needed, terminate now
DpWpCheck: dyn W19, pid 30022 no longer needed, terminate now

Sat Sep 21 19:18:24:526 2019


DpHdlDeadWp: W16 (pid=29192) terminated automatically
DpHdlDeadWp: W19 (pid=30022) terminated automatically

Sat Sep 21 19:19:26:936 2019


DpWpDynCreate: created new work process W18-1174

Sat Sep 21 19:20:22:669 2019


DpWpCheck: dyn W20, pid 32000 no longer needed, terminate now

Sat Sep 21 19:20:23:733 2019


DpHdlDeadWp: W20 (pid=32000) terminated automatically

Sat Sep 21 19:24:28:112 2019


DpHdlDeadWp: W18 (pid=1174) terminated automatically

Sat Sep 21 19:24:28:921 2019


DpWpDynCreate: created new work process W17-2884

Sat Sep 21 19:26:12:817 2019


DpWpDynCreate: created new work process W16-3626

Sat Sep 21 19:29:32:219 2019


DpHdlDeadWp: W17 (pid=2884) terminated automatically

Sat Sep 21 19:30:04:251 2019


DpWpDynCreate: created new work process W19-4768

Sat Sep 21 19:31:22:687 2019


DpWpCheck: dyn W16, pid 3626 no longer needed, terminate now

Sat Sep 21 19:31:23:331 2019


DpHdlDeadWp: W16 (pid=3626) terminated automatically

Sat Sep 21 19:31:24:068 2019


DpWpDynCreate: created new work process W20-5225

Sat Sep 21 19:31:25:091 2019


DpWpDynCreate: created new work process W18-5232

Sat Sep 21 19:35:09:766 2019


DpHdlDeadWp: W19 (pid=4768) terminated automatically

Sat Sep 21 19:36:29:663 2019


DpHdlDeadWp: W18 (pid=5232) terminated automatically
DpWpCheck: dyn W20, pid 5225 no longer needed, terminate now

Sat Sep 21 19:36:30:673 2019


DpHdlDeadWp: W20 (pid=5225) terminated automatically

Sat Sep 21 19:40:18:441 2019


DpWpDynCreate: created new work process W17-8202

Sat Sep 21 19:40:29:892 2019


DpWpDynCreate: created new work process W16-8415
DpWpDynCreate: created new work process W19-8416

Sat Sep 21 19:45:20:025 2019


DpHdlDeadWp: W17 (pid=8202) terminated automatically

Sat Sep 21 19:45:22:166 2019


DpWpDynCreate: created new work process W18-9918

Sat Sep 21 19:45:33:934 2019


DpHdlDeadWp: W16 (pid=8415) terminated automatically
DpWpCheck: dyn W19, pid 8416 no longer needed, terminate now

Sat Sep 21 19:45:35:027 2019


DpHdlDeadWp: W19 (pid=8416) terminated automatically

Sat Sep 21 19:46:09:545 2019


DpWpDynCreate: created new work process W20-10138

Sat Sep 21 19:50:23:567 2019


DpHdlDeadWp: W18 (pid=9918) terminated automatically

Sat Sep 21 19:50:24:278 2019


DpWpDynCreate: created new work process W17-11415

Sat Sep 21 19:51:22:729 2019


DpWpCheck: dyn W20, pid 10138 no longer needed, terminate now

Sat Sep 21 19:51:23:368 2019


DpHdlDeadWp: W20 (pid=10138) terminated automatically

Sat Sep 21 19:52:40:429 2019


DpWpDynCreate: created new work process W16-12132

Sat Sep 21 19:55:24:337 2019


DpWpDynCreate: created new work process W19-13091

Sat Sep 21 19:55:33:063 2019


DpHdlDeadWp: W17 (pid=11415) terminated automatically

Sat Sep 21 19:57:42:739 2019


DpWpCheck: dyn W16, pid 12132 no longer needed, terminate now

Sat Sep 21 19:57:43:730 2019


DpHdlDeadWp: W16 (pid=12132) terminated automatically

Sat Sep 21 19:58:08:580 2019


DpHdlDeadWp: W9 (pid=21181) terminated automatically
DpWpDynCreate: created new work process W9-13876

Sat Sep 21 20:00:06:236 2019


DpWpDynCreate: created new work process W18-14426

Sat Sep 21 20:00:25:979 2019


DpHdlDeadWp: W19 (pid=13091) terminated automatically

Sat Sep 21 20:00:38:927 2019


DpWpDynCreate: created new work process W20-14769

Sat Sep 21 20:00:48:418 2019


DpWpDynCreate: created new work process W17-14809

Sat Sep 21 20:04:05:108 2019


DpWpDynCreate: created new work process W16-18449

Sat Sep 21 20:04:12:975 2019


DpWpDynCreate: created new work process W19-18559

Sat Sep 21 20:05:22:753 2019


DpWpCheck: dyn W18, pid 14426 no longer needed, terminate now

Sat Sep 21 20:05:23:321 2019


DpHdlDeadWp: W18 (pid=14426) terminated automatically

Sat Sep 21 20:05:45:830 2019


DpHdlDeadWp: W20 (pid=14769) terminated automatically

Sat Sep 21 20:05:50:689 2019


DpHdlDeadWp: W17 (pid=14809) terminated automatically

Sat Sep 21 20:08:44:012 2019


DpWpDynCreate: created new work process W18-29580

Sat Sep 21 20:09:08:271 2019


DpHdlDeadWp: W16 (pid=18449) terminated automatically

Sat Sep 21 20:09:22:762 2019


DpWpCheck: dyn W19, pid 18559 no longer needed, terminate now

Sat Sep 21 20:09:23:382 2019


DpHdlDeadWp: W19 (pid=18559) terminated automatically

Sat Sep 21 20:10:36:843 2019


DpWpDynCreate: created new work process W20-338

Sat Sep 21 20:14:00:600 2019


DpHdlDeadWp: W18 (pid=29580) terminated automatically

Sat Sep 21 20:14:13:196 2019


DpWpDynCreate: created new work process W17-9973

Sat Sep 21 20:15:42:769 2019


DpWpCheck: dyn W20, pid 338 no longer needed, terminate now

Sat Sep 21 20:15:43:764 2019


DpHdlDeadWp: W20 (pid=338) terminated automatically

Sat Sep 21 20:16:08:566 2019


DpWpDynCreate: created new work process W16-15230

Sat Sep 21 20:19:20:055 2019


DpHdlDeadWp: W17 (pid=9973) terminated automatically

Sat Sep 21 20:21:09:584 2019


DpWpDynCreate: created new work process W19-16740

Sat Sep 21 20:21:10:093 2019


DpHdlDeadWp: W16 (pid=15230) terminated automatically

Sat Sep 21 20:25:04:491 2019


DpWpDynCreate: created new work process W18-18022

Sat Sep 21 20:26:15:551 2019


DpHdlDeadWp: W19 (pid=16740) terminated automatically
DpWpDynCreate: created new work process W20-18499

Sat Sep 21 20:30:06:236 2019


DpHdlDeadWp: W18 (pid=18022) terminated automatically

Sat Sep 21 20:31:13:278 2019


DpWpDynCreate: created new work process W17-20251
DpWpDynCreate: created new work process W16-20252

Sat Sep 21 20:31:16:393 2019


DpHdlDeadWp: W20 (pid=18499) terminated automatically

Sat Sep 21 20:34:24:271 2019


DpWpDynCreate: created new work process W19-21524

Sat Sep 21 20:36:15:631 2019


DpHdlDeadWp: W16 (pid=20252) terminated automatically
DpWpCheck: dyn W17, pid 20251 no longer needed, terminate now

Sat Sep 21 20:36:16:330 2019


DpHdlDeadWp: W17 (pid=20251) terminated automatically

Sat Sep 21 20:36:26:851 2019


DpWpDynCreate: created new work process W18-22162

Sat Sep 21 20:36:30:257 2019


DpWpDynCreate: created new work process W20-22240

Sat Sep 21 20:37:06:180 2019


DpWpDynCreate: created new work process W16-22414

Sat Sep 21 20:39:42:807 2019


DpWpCheck: dyn W19, pid 21524 no longer needed, terminate now

Sat Sep 21 20:39:43:868 2019


DpHdlDeadWp: W19 (pid=21524) terminated automatically

Sat Sep 21 20:41:27:216 2019


DpHdlDeadWp: W18 (pid=22162) terminated automatically

Sat Sep 21 20:41:42:809 2019


DpWpCheck: dyn W20, pid 22240 no longer needed, terminate now

Sat Sep 21 20:41:43:031 2019


DpHdlDeadWp: W20 (pid=22240) terminated automatically

Sat Sep 21 20:42:16:329 2019


DpHdlDeadWp: W16 (pid=22414) terminated automatically

Sat Sep 21 20:46:12:727 2019


DpWpDynCreate: created new work process W17-25712

Sat Sep 21 20:47:08:752 2019


DpWpDynCreate: created new work process W19-25986

Sat Sep 21 20:51:14:060 2019


DpHdlDeadWp: W17 (pid=25712) terminated automatically

Sat Sep 21 20:52:11:985 2019


DpHdlDeadWp: W19 (pid=25986) terminated automatically

Sat Sep 21 20:53:08:605 2019


DpWpDynCreate: created new work process W18-27878

Sat Sep 21 20:54:05:303 2019


DpWpDynCreate: created new work process W20-28088

Sat Sep 21 20:56:09:902 2019


DpHdlDeadWp: W12 (pid=21660) terminated automatically
DpWpDynCreate: created new work process W12-28810

Sat Sep 21 20:58:22:835 2019


DpWpCheck: dyn W18, pid 27878 no longer needed, terminate now

Sat Sep 21 20:58:23:141 2019


DpHdlDeadWp: W18 (pid=27878) terminated automatically

Sat Sep 21 20:59:06:395 2019


DpHdlDeadWp: W20 (pid=28088) terminated automatically

Sat Sep 21 20:59:25:457 2019


DpWpDynCreate: created new work process W16-29761

Sat Sep 21 21:01:14:700 2019


DpWpDynCreate: created new work process W17-30696

Sat Sep 21 21:04:28:659 2019


DpHdlDeadWp: W16 (pid=29761) terminated automatically

Sat Sep 21 21:06:22:852 2019


DpWpCheck: dyn W17, pid 30696 no longer needed, terminate now

Sat Sep 21 21:06:24:019 2019


DpHdlDeadWp: W17 (pid=30696) terminated automatically

Sat Sep 21 21:07:18:456 2019


DpWpDynCreate: created new work process W19-19597

Sat Sep 21 21:07:19:695 2019


DpWpDynCreate: created new work process W18-19689

Sat Sep 21 21:07:28:810 2019


DpWpDynCreate: created new work process W20-20461

Sat Sep 21 21:12:19:346 2019


DpHdlDeadWp: W19 (pid=19597) terminated automatically

Sat Sep 21 21:12:20:372 2019


DpHdlDeadWp: W18 (pid=19689) terminated automatically

Sat Sep 21 21:12:20:631 2019


DpWpDynCreate: created new work process W16-29569

Sat Sep 21 21:12:29:449 2019


DpHdlDeadWp: W20 (pid=20461) terminated automatically

Sat Sep 21 21:14:11:691 2019


DpWpDynCreate: created new work process W17-30177

Sat Sep 21 21:17:22:588 2019


DpHdlDeadWp: W16 (pid=29569) terminated automatically

Sat Sep 21 21:19:12:679 2019


DpHdlDeadWp: W17 (pid=30177) terminated automatically

Sat Sep 21 21:22:11:260 2019


DpWpDynCreate: created new work process W19-630

Sat Sep 21 21:22:12:607 2019


DpWpDynCreate: created new work process W18-639

Sat Sep 21 21:22:22:545 2019


DpWpDynCreate: created new work process W20-685

Sat Sep 21 21:27:14:083 2019


DpHdlDeadWp: W18 (pid=639) terminated automatically

Sat Sep 21 21:27:14:694 2019


DpHdlDeadWp: W19 (pid=630) terminated automatically

Sat Sep 21 21:27:18:950 2019


DpWpDynCreate: created new work process W16-2245

Sat Sep 21 21:27:19:092 2019


DpWpDynCreate: created new work process W17-2246

Sat Sep 21 21:27:25:393 2019


DpHdlDeadWp: W20 (pid=685) terminated automatically

Sat Sep 21 21:31:05:511 2019


DpWpDynCreate: created new work process W18-3734

Sat Sep 21 21:32:14:989 2019


DpWpDynCreate: created new work process W19-4163

Sat Sep 21 21:32:22:892 2019


DpWpCheck: dyn W16, pid 2245 no longer needed, terminate now
DpWpCheck: dyn W17, pid 2246 no longer needed, terminate now

Sat Sep 21 21:32:23:398 2019


DpHdlDeadWp: W16 (pid=2245) terminated automatically
DpHdlDeadWp: W17 (pid=2246) terminated automatically

Sat Sep 21 21:33:06:801 2019


DpWpDynCreate: created new work process W20-4371

Sat Sep 21 21:36:06:905 2019


DpHdlDeadWp: W18 (pid=3734) terminated automatically

Sat Sep 21 21:36:13:405 2019


DpWpDynCreate: created new work process W16-5429

Sat Sep 21 21:36:48:204 2019


DpWpDynCreate: created new work process W17-5627

Sat Sep 21 21:37:22:901 2019


DpWpCheck: dyn W19, pid 4163 no longer needed, terminate now

Sat Sep 21 21:37:23:990 2019


DpHdlDeadWp: W19 (pid=4163) terminated automatically

Sat Sep 21 21:38:22:903 2019


DpWpCheck: dyn W20, pid 4371 no longer needed, terminate now

Sat Sep 21 21:38:23:152 2019


DpHdlDeadWp: W20 (pid=4371) terminated automatically

Sat Sep 21 21:41:22:910 2019


DpWpCheck: dyn W16, pid 5429 no longer needed, terminate now

Sat Sep 21 21:41:23:369 2019


DpHdlDeadWp: W16 (pid=5429) terminated automatically

Sat Sep 21 21:41:50:456 2019


DpHdlDeadWp: W17 (pid=5627) terminated automatically

Sat Sep 21 21:42:11:736 2019


DpWpDynCreate: created new work process W18-7489

Sat Sep 21 21:42:12:033 2019


DpWpDynCreate: created new work process W19-7492

Sat Sep 21 21:46:06:376 2019


DpWpDynCreate: created new work process W20-8606

Sat Sep 21 21:47:11:320 2019


DpWpDynCreate: created new work process W16-8998

Sat Sep 21 21:47:12:170 2019


DpHdlDeadWp: W18 (pid=7489) terminated automatically

Sat Sep 21 21:47:13:166 2019


DpWpDynCreate: created new work process W17-9011
DpHdlDeadWp: W19 (pid=7492) terminated automatically
DpWpDynCreate: created new work process W19-9012

Sat Sep 21 21:51:17:398 2019


DpHdlDeadWp: W20 (pid=8606) terminated automatically

Sat Sep 21 21:52:13:210 2019


DpWpDynCreate: created new work process W18-10731

Sat Sep 21 21:52:14:504 2019


DpWpCheck: dyn W16, pid 8998 no longer needed, terminate now
DpHdlDeadWp: W19 (pid=9012) terminated automatically

Sat Sep 21 21:52:15:510 2019


DpHdlDeadWp: W16 (pid=8998) terminated automatically
DpHdlDeadWp: W17 (pid=9011) terminated automatically

Sat Sep 21 21:52:16:638 2019


DpWpDynCreate: created new work process W20-10736

Sat Sep 21 21:52:25:958 2019


DpWpDynCreate: created new work process W19-10743

Sat Sep 21 21:57:16:769 2019


DpHdlDeadWp: W18 (pid=10731) terminated automatically

Sat Sep 21 21:57:22:207 2019


DpHdlDeadWp: W20 (pid=10736) terminated automatically

Sat Sep 21 21:57:30:985 2019


DpHdlDeadWp: W19 (pid=10743) terminated automatically

Sat Sep 21 21:59:21:235 2019


DpWpDynCreate: created new work process W16-12972

Sat Sep 21 22:04:22:958 2019


DpWpCheck: dyn W16, pid 12972 no longer needed, terminate now

Sat Sep 21 22:04:23:604 2019


DpHdlDeadWp: W16 (pid=12972) terminated automatically

Sat Sep 21 22:06:14:225 2019


DpWpDynCreate: created new work process W17-31871

Sat Sep 21 22:07:15:866 2019


DpWpDynCreate: created new work process W18-4986

Sat Sep 21 22:11:22:970 2019


DpWpCheck: dyn W17, pid 31871 no longer needed, terminate now

Sat Sep 21 22:11:23:619 2019


DpHdlDeadWp: W17 (pid=31871) terminated automatically

Sat Sep 21 22:12:17:064 2019


DpHdlDeadWp: W18 (pid=4986) terminated automatically

Sat Sep 21 22:12:27:867 2019


DpWpDynCreate: created new work process W20-12686

Sat Sep 21 22:17:42:979 2019


DpWpCheck: dyn W20, pid 12686 no longer needed, terminate now

Sat Sep 21 22:17:43:296 2019


DpHdlDeadWp: W20 (pid=12686) terminated automatically

Sat Sep 21 22:22:02:072 2019


DpWpDynCreate: created new work process W19-15809

Sat Sep 21 22:23:14:291 2019


DpWpDynCreate: created new work process W16-16199

Sat Sep 21 22:27:07:549 2019


DpHdlDeadWp: W19 (pid=15809) terminated automatically

Sat Sep 21 22:28:14:712 2019


DpWpDynCreate: created new work process W17-17762

Sat Sep 21 22:28:15:680 2019


DpHdlDeadWp: W16 (pid=16199) terminated automatically

Sat Sep 21 22:28:17:625 2019


DpWpDynCreate: created new work process W18-17780

Sat Sep 21 22:33:13:287 2019


DpWpDynCreate: created new work process W20-19426

Sat Sep 21 22:33:19:987 2019


DpWpCheck: dyn W17, pid 17762 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=17780) terminated automatically

Sat Sep 21 22:33:20:563 2019


DpHdlDeadWp: W17 (pid=17762) terminated automatically

Sat Sep 21 22:35:05:496 2019


DpWpDynCreate: created new work process W19-19923

Sat Sep 21 22:35:07:398 2019


DpWpDynCreate: created new work process W16-19940

Sat Sep 21 22:38:15:290 2019


DpHdlDeadWp: W20 (pid=19426) terminated automatically

Sat Sep 21 22:40:06:297 2019


DpHdlDeadWp: W19 (pid=19923) terminated automatically

Sat Sep 21 22:40:11:496 2019


DpHdlDeadWp: W16 (pid=19940) terminated automatically

Sat Sep 21 22:40:12:100 2019


DpWpDynCreate: created new work process W18-21926

Sat Sep 21 22:42:11:748 2019


DpWpDynCreate: created new work process W17-22517

Sat Sep 21 22:43:20:789 2019


DpWpDynCreate: created new work process W20-22881

Sat Sep 21 22:45:14:883 2019


DpHdlDeadWp: W18 (pid=21926) terminated automatically

Sat Sep 21 22:47:12:583 2019


DpHdlDeadWp: W17 (pid=22517) terminated automatically

Sat Sep 21 22:47:23:362 2019


DpWpDynCreate: created new work process W19-24361

Sat Sep 21 22:48:14:951 2019


DpWpDynCreate: created new work process W16-24657

Sat Sep 21 22:48:23:030 2019


DpWpCheck: dyn W20, pid 22881 no longer needed, terminate now

Sat Sep 21 22:48:23:193 2019


DpHdlDeadWp: W20 (pid=22881) terminated automatically

Sat Sep 21 22:50:15:535 2019


DpWpDynCreate: created new work process W18-25404

Sat Sep 21 22:52:27:355 2019


DpHdlDeadWp: W19 (pid=24361) terminated automatically

Sat Sep 21 22:53:23:040 2019


DpWpCheck: dyn W16, pid 24657 no longer needed, terminate now

Sat Sep 21 22:53:24:950 2019


DpHdlDeadWp: W16 (pid=24657) terminated automatically

Sat Sep 21 22:54:12:277 2019


DpWpDynCreate: created new work process W17-26536

Sat Sep 21 22:54:13:193 2019


DpWpDynCreate: created new work process W20-26541

Sat Sep 21 22:55:17:455 2019


DpHdlDeadWp: W18 (pid=25404) terminated automatically

Sat Sep 21 22:59:14:617 2019


DpHdlDeadWp: W20 (pid=26541) terminated automatically

Sat Sep 21 22:59:22:367 2019


DpHdlDeadWp: W17 (pid=26536) terminated automatically

Sat Sep 21 23:01:12:479 2019


DpWpDynCreate: created new work process W19-29093

Sat Sep 21 23:04:17:814 2019


DpWpDynCreate: created new work process W16-8713

Sat Sep 21 23:06:13:684 2019


DpHdlDeadWp: W19 (pid=29093) terminated automatically

Sat Sep 21 23:07:05:452 2019


DpWpDynCreate: created new work process W18-15718

Sat Sep 21 23:09:23:067 2019


DpWpCheck: dyn W16, pid 8713 no longer needed, terminate now

Sat Sep 21 23:09:24:166 2019


DpHdlDeadWp: W16 (pid=8713) terminated automatically

Sat Sep 21 23:10:14:554 2019


DpWpDynCreate: created new work process W20-27432

Sat Sep 21 23:12:06:753 2019


DpHdlDeadWp: W18 (pid=15718) terminated automatically

Sat Sep 21 23:14:08:456 2019


DpWpDynCreate: created new work process W17-28576

Sat Sep 21 23:15:11:594 2019


DpWpDynCreate: created new work process W19-28997

Sat Sep 21 23:15:17:973 2019


DpHdlDeadWp: W20 (pid=27432) terminated automatically

Sat Sep 21 23:19:23:083 2019


DpWpCheck: dyn W17, pid 28576 no longer needed, terminate now

Sat Sep 21 23:19:24:178 2019


DpHdlDeadWp: W17 (pid=28576) terminated automatically

Sat Sep 21 23:20:16:219 2019


DpHdlDeadWp: W19 (pid=28997) terminated automatically

Sat Sep 21 23:20:20:426 2019


DpWpDynCreate: created new work process W16-30653

Sat Sep 21 23:22:11:385 2019


DpWpDynCreate: created new work process W18-31238

Sat Sep 21 23:25:21:526 2019


DpHdlDeadWp: W16 (pid=30653) terminated automatically

Sat Sep 21 23:25:22:440 2019


DpWpDynCreate: created new work process W20-32400

Sat Sep 21 23:27:12:628 2019


DpHdlDeadWp: W18 (pid=31238) terminated automatically
Sat Sep 21 23:29:07:373 2019
DpWpDynCreate: created new work process W17-1028

Sat Sep 21 23:30:24:749 2019


DpHdlDeadWp: W20 (pid=32400) terminated automatically

Sat Sep 21 23:31:23:403 2019


DpWpDynCreate: created new work process W19-1858

Sat Sep 21 23:34:23:113 2019


DpWpCheck: dyn W17, pid 1028 no longer needed, terminate now

Sat Sep 21 23:34:23:780 2019


DpHdlDeadWp: W17 (pid=1028) terminated automatically

Sat Sep 21 23:35:10:846 2019


DpWpDynCreate: created new work process W16-3220

Sat Sep 21 23:35:13:536 2019


DpWpDynCreate: created new work process W18-3241

Sat Sep 21 23:36:25:118 2019


DpHdlDeadWp: W19 (pid=1858) terminated automatically

Sat Sep 21 23:40:12:010 2019


DpHdlDeadWp: W16 (pid=3220) terminated automatically

Sat Sep 21 23:40:23:218 2019


DpHdlDeadWp: W18 (pid=3241) terminated automatically

Sat Sep 21 23:40:24:672 2019


DpWpDynCreate: created new work process W20-5098

Sat Sep 21 23:44:08:396 2019


DpWpDynCreate: created new work process W17-6559

Sat Sep 21 23:45:43:132 2019


DpWpCheck: dyn W20, pid 5098 no longer needed, terminate now

Sat Sep 21 23:45:43:379 2019


DpHdlDeadWp: W20 (pid=5098) terminated automatically

Sat Sep 21 23:46:12:727 2019


DpWpDynCreate: created new work process W19-7201

Sat Sep 21 23:49:10:643 2019


DpHdlDeadWp: W17 (pid=6559) terminated automatically

Sat Sep 21 23:51:19:833 2019


DpHdlDeadWp: W19 (pid=7201) terminated automatically

Sat Sep 21 23:52:12:414 2019


DpWpDynCreate: created new work process W16-9169

Sat Sep 21 23:52:28:441 2019


DpWpDynCreate: created new work process W18-9197
DpWpDynCreate: created new work process W20-9198

Sat Sep 21 23:57:14:083 2019


DpHdlDeadWp: W16 (pid=9169) terminated automatically

Sat Sep 21 23:57:35:116 2019


DpHdlDeadWp: W18 (pid=9197) terminated automatically
DpWpCheck: dyn W20, pid 9198 no longer needed, terminate now

Sat Sep 21 23:57:35:433 2019


DpHdlDeadWp: W20 (pid=9198) terminated automatically

Sat Sep 21 23:59:09:881 2019


DpWpDynCreate: created new work process W17-11223

Sun Sep 22 00:00:03:712 2019


***LOG Q1O=> DpWpConf, WP Conf () [dpxxwp.c 2947]
DpWpConf: change wps: DIA 9->8, VB 1->1, ENQ 0->0, BTC 5->5, SPO 1->1 VB2 1->1 STANDBY 0->0 DYN 5->5
DpWpConf: wp reconfiguration, stop W17, pid 11223
DpAdaptWppriv_max_no : 4 -> 4

Sun Sep 22 00:00:03:828 2019


DpHdlDeadWp: W17 (pid=11223) terminated automatically

Sun Sep 22 00:03:29:904 2019


DpWpDynCreate: created new work process W19-16467

Sun Sep 22 00:08:08:413 2019


DpWpDynCreate: created new work process W16-30707

Sun Sep 22 00:08:16:363 2019


DpWpDynCreate: created new work process W18-30827

Sun Sep 22 00:08:16:529 2019


DpWpDynCreate: created new work process W20-30828

Sun Sep 22 00:08:43:173 2019


DpWpCheck: dyn W19, pid 16467 no longer needed, terminate now

Sun Sep 22 00:08:43:812 2019


DpHdlDeadWp: W19 (pid=16467) terminated automatically

Sun Sep 22 00:13:23:180 2019


DpWpCheck: dyn W16, pid 30707 no longer needed, terminate now
DpWpCheck: dyn W18, pid 30827 no longer needed, terminate now
DpWpCheck: dyn W20, pid 30828 no longer needed, terminate now

Sun Sep 22 00:13:24:151 2019


DpHdlDeadWp: W16 (pid=30707) terminated automatically
DpHdlDeadWp: W18 (pid=30827) terminated automatically
DpHdlDeadWp: W20 (pid=30828) terminated automatically

Sun Sep 22 00:14:07:293 2019


DpWpDynCreate: created new work process W17-13074

Sun Sep 22 00:14:23:106 2019


DpWpDynCreate: created new work process W19-13649

Sun Sep 22 00:14:27:857 2019


DpWpDynCreate: created new work process W16-13745
Sun Sep 22 00:19:11:053 2019
DpHdlDeadWp: W17 (pid=13074) terminated automatically

Sun Sep 22 00:19:43:197 2019


DpWpCheck: dyn W16, pid 13745 no longer needed, terminate now
DpWpCheck: dyn W19, pid 13649 no longer needed, terminate now

Sun Sep 22 00:19:44:162 2019


DpHdlDeadWp: W16 (pid=13745) terminated automatically
DpHdlDeadWp: W19 (pid=13649) terminated automatically

Sun Sep 22 00:20:16:243 2019


DpWpDynCreate: created new work process W18-25242

Sun Sep 22 00:20:19:251 2019


DpWpDynCreate: created new work process W20-25316

Sun Sep 22 00:20:27:068 2019


DpWpDynCreate: created new work process W17-25576

Sun Sep 22 00:25:21:245 2019


DpWpCheck: dyn W18, pid 25242 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=25316) terminated automatically

Sun Sep 22 00:25:21:431 2019


DpHdlDeadWp: W18 (pid=25242) terminated automatically

Sun Sep 22 00:25:28:947 2019


DpHdlDeadWp: W17 (pid=25576) terminated automatically

Sun Sep 22 00:25:39:943 2019


DpWpDynCreate: created new work process W16-3793

Sun Sep 22 00:30:41:538 2019


DpHdlDeadWp: W16 (pid=3793) terminated automatically

Sun Sep 22 00:34:16:062 2019


DpWpDynCreate: created new work process W19-8970

Sun Sep 22 00:39:23:230 2019


DpWpCheck: dyn W19, pid 8970 no longer needed, terminate now

Sun Sep 22 00:39:24:200 2019


DpHdlDeadWp: W19 (pid=8970) terminated automatically

Sun Sep 22 00:39:58:487 2019


DpWpDynCreate: created new work process W20-10875

Sun Sep 22 00:39:59:057 2019


DpWpDynCreate: created new work process W18-10878

Sun Sep 22 00:43:45:164 2019


DpWpDynCreate: created new work process W17-12168

Sun Sep 22 00:45:03:238 2019


DpWpCheck: dyn W18, pid 10878 no longer needed, terminate now

Sun Sep 22 00:45:03:797 2019


DpHdlDeadWp: W18 (pid=10878) terminated automatically
DpHdlDeadWp: W20 (pid=10875) terminated automatically

Sun Sep 22 00:45:10:110 2019


DpWpDynCreate: created new work process W16-12634

Sun Sep 22 00:47:11:657 2019


DpWpDynCreate: created new work process W19-14222

Sun Sep 22 00:49:03:245 2019


DpWpCheck: dyn W17, pid 12168 no longer needed, terminate now

Sun Sep 22 00:49:03:875 2019


DpHdlDeadWp: W17 (pid=12168) terminated automatically

Sun Sep 22 00:50:20:188 2019


DpWpDynCreate: created new work process W18-20127

Sun Sep 22 00:50:21:815 2019


DpWpDynCreate: created new work process W20-20298

Sun Sep 22 00:50:23:248 2019


DpWpCheck: dyn W16, pid 12634 no longer needed, terminate now

Sun Sep 22 00:50:23:721 2019


DpHdlDeadWp: W16 (pid=12634) terminated automatically

Sun Sep 22 00:52:23:251 2019


DpWpCheck: dyn W19, pid 14222 no longer needed, terminate now

Sun Sep 22 00:52:23:522 2019


DpHdlDeadWp: W19 (pid=14222) terminated automatically

Sun Sep 22 00:55:06:829 2019


DpWpDynCreate: created new work process W17-28242

Sun Sep 22 00:55:22:567 2019


DpWpCheck: dyn W18, pid 20127 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=20298) terminated automatically

Sun Sep 22 00:55:22:667 2019


DpHdlDeadWp: W18 (pid=20127) terminated automatically

Sun Sep 22 00:55:24:139 2019


DpWpDynCreate: created new work process W16-28341

Sun Sep 22 00:55:24:670 2019


DpWpDynCreate: created new work process W19-28342

Sun Sep 22 01:00:23:262 2019


DpWpCheck: dyn W17, pid 28242 no longer needed, terminate now

Sun Sep 22 01:00:23:663 2019


DpHdlDeadWp: W17 (pid=28242) terminated automatically

Sun Sep 22 01:00:43:263 2019


DpWpCheck: dyn W16, pid 28341 no longer needed, terminate now
DpWpCheck: dyn W19, pid 28342 no longer needed, terminate now

Sun Sep 22 01:00:43:784 2019


DpHdlDeadWp: W16 (pid=28341) terminated automatically
DpHdlDeadWp: W19 (pid=28342) terminated automatically

Sun Sep 22 01:01:26:117 2019


DpWpDynCreate: created new work process W20-31215

Sun Sep 22 01:04:48:239 2019


DpHdlDeadWp: W9 (pid=13876) terminated automatically
DpWpDynCreate: created new work process W9-9709

Sun Sep 22 01:06:21:519 2019


DpWpDynCreate: created new work process W18-13569

Sun Sep 22 01:06:28:003 2019


DpHdlDeadWp: W20 (pid=31215) terminated automatically

Sun Sep 22 01:06:28:514 2019


DpWpDynCreate: created new work process W17-13605

Sun Sep 22 01:07:34:397 2019


DpWpDynCreate: created new work process W16-16578

Sun Sep 22 01:11:23:277 2019


DpWpCheck: dyn W18, pid 13569 no longer needed, terminate now

Sun Sep 22 01:11:23:657 2019


DpHdlDeadWp: W18 (pid=13569) terminated automatically

Sun Sep 22 01:11:33:901 2019


DpHdlDeadWp: W17 (pid=13605) terminated automatically

Sun Sep 22 01:12:11:938 2019


DpWpDynCreate: created new work process W19-29658

Sun Sep 22 01:12:43:279 2019


DpWpCheck: dyn W16, pid 16578 no longer needed, terminate now

Sun Sep 22 01:12:44:049 2019


DpHdlDeadWp: W16 (pid=16578) terminated automatically

Sun Sep 22 01:13:26:399 2019


DpWpDynCreate: created new work process W20-29941

Sun Sep 22 01:16:07:220 2019


DpWpDynCreate: created new work process W18-30781

Sun Sep 22 01:17:12:241 2019


DpHdlDeadWp: W19 (pid=29658) terminated automatically

Sun Sep 22 01:18:33:438 2019


DpHdlDeadWp: W20 (pid=29941) terminated automatically

Sun Sep 22 01:20:06:928 2019


DpWpDynCreate: created new work process W17-32289

Sun Sep 22 01:21:12:716 2019


DpHdlDeadWp: W18 (pid=30781) terminated automatically

Sun Sep 22 01:21:23:314 2019


DpWpDynCreate: created new work process W16-32760

Sun Sep 22 01:25:06:970 2019


DpWpDynCreate: created new work process W19-1373

Sun Sep 22 01:25:08:340 2019


DpHdlDeadWp: W17 (pid=32289) terminated automatically

Sun Sep 22 01:26:14:300 2019


DpWpDynCreate: created new work process W20-1874

Sun Sep 22 01:26:25:471 2019


DpHdlDeadWp: W16 (pid=32760) terminated automatically

Sun Sep 22 01:30:07:320 2019


DpHdlDeadWp: W19 (pid=1373) terminated automatically

Sun Sep 22 01:30:07:636 2019


DpWpDynCreate: created new work process W18-3501

Sun Sep 22 01:31:08:021 2019


DpWpDynCreate: created new work process W17-3707

Sun Sep 22 01:31:16:617 2019


DpHdlDeadWp: W20 (pid=1874) terminated automatically

Sun Sep 22 01:32:16:857 2019


DpWpDynCreate: created new work process W16-4197

Sun Sep 22 01:35:10:262 2019


DpHdlDeadWp: W18 (pid=3501) terminated automatically

Sun Sep 22 01:36:08:947 2019


DpWpDynCreate: created new work process W19-5413

Sun Sep 22 01:36:09:369 2019


DpHdlDeadWp: W17 (pid=3707) terminated automatically

Sun Sep 22 01:37:23:323 2019


DpWpCheck: dyn W16, pid 4197 no longer needed, terminate now

Sun Sep 22 01:37:24:395 2019


DpHdlDeadWp: W16 (pid=4197) terminated automatically

Sun Sep 22 01:40:43:366 2019


DpHdlDeadWp: W10 (pid=28398) terminated automatically
DpWpDynCreate: created new work process W10-7133

Sun Sep 22 01:40:44:210 2019


DpWpDynCreate: created new work process W20-7138

Sun Sep 22 01:41:10:329 2019


DpHdlDeadWp: W19 (pid=5413) terminated automatically

Sun Sep 22 01:42:12:072 2019


DpWpDynCreate: created new work process W18-7543

Sun Sep 22 01:42:13:985 2019


DpWpDynCreate: created new work process W17-7572
Sun Sep 22 01:46:03:912 2019
DpHdlDeadWp: W20 (pid=7138) terminated automatically

Sun Sep 22 01:46:18:431 2019


DpWpDynCreate: created new work process W16-8861

Sun Sep 22 01:47:14:480 2019


DpWpCheck: dyn W17, pid 7572 no longer needed, terminate now
DpHdlDeadWp: W18 (pid=7543) terminated automatically

Sun Sep 22 01:47:15:545 2019


DpHdlDeadWp: W17 (pid=7572) terminated automatically

Sun Sep 22 01:49:08:279 2019


DpWpDynCreate: created new work process W19-9919

Sun Sep 22 01:49:08:983 2019


DpWpDynCreate: created new work process W20-9942

Sun Sep 22 01:51:20:288 2019


DpHdlDeadWp: W16 (pid=8861) terminated automatically

Sun Sep 22 01:54:10:219 2019


DpHdlDeadWp: W19 (pid=9919) terminated automatically
DpHdlDeadWp: W20 (pid=9942) terminated automatically

Sun Sep 22 01:56:40:539 2019


DpWpDynCreate: created new work process W18-12250

Sun Sep 22 01:56:50:901 2019


DpWpDynCreate: created new work process W17-12258

Sun Sep 22 02:01:43:401 2019


DpWpCheck: dyn W18, pid 12250 no longer needed, terminate now

Sun Sep 22 02:01:44:793 2019


DpHdlDeadWp: W18 (pid=12250) terminated automatically

Sun Sep 22 02:02:03:896 2019


DpHdlDeadWp: W17 (pid=12258) terminated automatically

Sun Sep 22 02:02:12:074 2019


DpWpDynCreate: created new work process W16-14909

Sun Sep 22 02:02:42:310 2019


DpWpDynCreate: created new work process W19-15539

Sun Sep 22 02:07:14:152 2019


DpHdlDeadWp: W16 (pid=14909) terminated automatically

Sun Sep 22 02:07:28:595 2019


DpWpDynCreate: created new work process W20-20607

Sun Sep 22 02:07:43:414 2019


DpWpCheck: dyn W19, pid 15539 no longer needed, terminate now

Sun Sep 22 02:07:44:318 2019


DpHdlDeadWp: W19 (pid=15539) terminated automatically
Sun Sep 22 02:12:06:308 2019
DpWpDynCreate: created new work process W18-31046

Sun Sep 22 02:12:43:422 2019


DpWpCheck: dyn W20, pid 20607 no longer needed, terminate now

Sun Sep 22 02:12:43:672 2019


DpHdlDeadWp: W20 (pid=20607) terminated automatically

Sun Sep 22 02:13:06:879 2019


DpWpDynCreate: created new work process W17-32351

Sun Sep 22 02:17:12:864 2019


DpHdlDeadWp: W18 (pid=31046) terminated automatically

Sun Sep 22 02:18:23:433 2019


DpWpCheck: dyn W17, pid 32351 no longer needed, terminate now

Sun Sep 22 02:18:24:003 2019


DpHdlDeadWp: W17 (pid=32351) terminated automatically

Sun Sep 22 02:19:07:221 2019


DpWpDynCreate: created new work process W16-14798

Sun Sep 22 02:24:08:454 2019


DpHdlDeadWp: W16 (pid=14798) terminated automatically
DpWpDynCreate: created new work process W16-16338

Sun Sep 22 02:26:09:583 2019


DpWpDynCreate: created new work process W19-17110

Sun Sep 22 02:29:09:760 2019


DpHdlDeadWp: W16 (pid=16338) terminated automatically

Sun Sep 22 02:29:14:477 2019


DpWpDynCreate: created new work process W20-18127

Sun Sep 22 02:31:23:452 2019


DpWpCheck: dyn W19, pid 17110 no longer needed, terminate now

Sun Sep 22 02:31:23:560 2019


DpHdlDeadWp: W19 (pid=17110) terminated automatically

Sun Sep 22 02:32:29:036 2019


DpWpDynCreate: created new work process W18-19043

Sun Sep 22 02:32:29:888 2019


DpWpDynCreate: created new work process W17-19144

Sun Sep 22 02:34:16:100 2019


DpHdlDeadWp: W20 (pid=18127) terminated automatically

Sun Sep 22 02:37:30:959 2019


DpHdlDeadWp: W17 (pid=19144) terminated automatically
DpWpCheck: dyn W18, pid 19043 no longer needed, terminate now

Sun Sep 22 02:37:31:212 2019


DpHdlDeadWp: W18 (pid=19043) terminated automatically
Sun Sep 22 02:39:12:565 2019
DpWpDynCreate: created new work process W16-21533

Sun Sep 22 02:41:12:380 2019


DpWpDynCreate: created new work process W19-22203

Sun Sep 22 02:41:27:187 2019


DpWpDynCreate: created new work process W20-22214

Sun Sep 22 02:44:23:473 2019


DpWpCheck: dyn W16, pid 21533 no longer needed, terminate now

Sun Sep 22 02:44:23:942 2019


DpHdlDeadWp: W16 (pid=21533) terminated automatically

Sun Sep 22 02:46:23:477 2019


DpWpCheck: dyn W19, pid 22203 no longer needed, terminate now

Sun Sep 22 02:46:23:735 2019


DpHdlDeadWp: W19 (pid=22203) terminated automatically

Sun Sep 22 02:46:43:478 2019


DpWpCheck: dyn W20, pid 22214 no longer needed, terminate now

Sun Sep 22 02:46:43:760 2019


DpHdlDeadWp: W20 (pid=22214) terminated automatically

Sun Sep 22 02:47:32:599 2019


DpWpDynCreate: created new work process W17-24651

Sun Sep 22 02:47:32:980 2019


DpWpDynCreate: created new work process W18-24652

Sun Sep 22 02:52:43:487 2019


DpWpCheck: dyn W17, pid 24651 no longer needed, terminate now
DpWpCheck: dyn W18, pid 24652 no longer needed, terminate now

Sun Sep 22 02:52:44:260 2019


DpHdlDeadWp: W17 (pid=24651) terminated automatically
DpHdlDeadWp: W18 (pid=24652) terminated automatically

Sun Sep 22 02:56:09:265 2019


DpWpDynCreate: created new work process W16-27384

Sun Sep 22 03:00:31:028 2019


DpWpDynCreate: created new work process W19-28865

Sun Sep 22 03:01:23:500 2019


DpWpCheck: dyn W16, pid 27384 no longer needed, terminate now

Sun Sep 22 03:01:23:742 2019


DpHdlDeadWp: W16 (pid=27384) terminated automatically

Sun Sep 22 03:05:33:250 2019


DpHdlDeadWp: W19 (pid=28865) terminated automatically

Sun Sep 22 03:09:05:642 2019


DpWpDynCreate: created new work process W20-26896
Sun Sep 22 03:14:06:688 2019
DpHdlDeadWp: W20 (pid=26896) terminated automatically

Sun Sep 22 03:16:27:888 2019


DpWpDynCreate: created new work process W17-29433

Sun Sep 22 03:21:29:176 2019


DpHdlDeadWp: W17 (pid=29433) terminated automatically

Sun Sep 22 03:22:35:017 2019


DpWpDynCreate: created new work process W18-31606

Sun Sep 22 03:25:25:950 2019


DpWpDynCreate: created new work process W16-32448

Sun Sep 22 03:27:43:541 2019


DpWpCheck: dyn W18, pid 31606 no longer needed, terminate now

Sun Sep 22 03:27:44:612 2019


DpHdlDeadWp: W18 (pid=31606) terminated automatically

Sun Sep 22 03:30:08:115 2019


DpWpDynCreate: created new work process W19-2347

Sun Sep 22 03:30:32:815 2019


DpHdlDeadWp: W16 (pid=32448) terminated automatically

Sun Sep 22 03:35:23:552 2019


DpWpCheck: dyn W19, pid 2347 no longer needed, terminate now

Sun Sep 22 03:35:24:055 2019


DpHdlDeadWp: W19 (pid=2347) terminated automatically

Sun Sep 22 03:38:15:946 2019


DpWpDynCreate: created new work process W20-5219

Sun Sep 22 03:43:19:245 2019


DpHdlDeadWp: W20 (pid=5219) terminated automatically

Sun Sep 22 03:45:10:508 2019


DpWpDynCreate: created new work process W17-7622

Sun Sep 22 03:48:47:745 2019


DpWpDynCreate: created new work process W18-8734

Sun Sep 22 03:50:23:579 2019


DpWpCheck: dyn W17, pid 7622 no longer needed, terminate now

Sun Sep 22 03:50:23:841 2019


DpHdlDeadWp: W17 (pid=7622) terminated automatically

Sun Sep 22 03:51:25:074 2019


DpWpDynCreate: created new work process W16-9684

Sun Sep 22 03:54:03:585 2019


DpWpCheck: dyn W18, pid 8734 no longer needed, terminate now

Sun Sep 22 03:54:04:104 2019


DpHdlDeadWp: W18 (pid=8734) terminated automatically

Sun Sep 22 03:54:10:727 2019


DpWpDynCreate: created new work process W19-10933

Sun Sep 22 03:54:32:888 2019


DpWpDynCreate: created new work process W20-11103

Sun Sep 22 03:56:26:196 2019


DpHdlDeadWp: W16 (pid=9684) terminated automatically

Sun Sep 22 03:59:23:592 2019


DpWpCheck: dyn W19, pid 10933 no longer needed, terminate now

Sun Sep 22 03:59:24:468 2019


DpHdlDeadWp: W19 (pid=10933) terminated automatically

Sun Sep 22 03:59:43:593 2019


DpWpCheck: dyn W20, pid 11103 no longer needed, terminate now

Sun Sep 22 03:59:44:499 2019


DpHdlDeadWp: W20 (pid=11103) terminated automatically

Sun Sep 22 04:00:07:283 2019


DpWpDynCreate: created new work process W17-12858

Sun Sep 22 04:00:19:208 2019


DpWpDynCreate: created new work process W18-12946

Sun Sep 22 04:00:23:046 2019


DpWpDynCreate: created new work process W16-12996

Sun Sep 22 04:05:23:603 2019


DpWpCheck: dyn W17, pid 12858 no longer needed, terminate now
DpWpCheck: dyn W18, pid 12946 no longer needed, terminate now

Sun Sep 22 04:05:23:838 2019


DpHdlDeadWp: W17 (pid=12858) terminated automatically
DpHdlDeadWp: W18 (pid=12946) terminated automatically

Sun Sep 22 04:05:43:604 2019


DpWpCheck: dyn W16, pid 12996 no longer needed, terminate now

Sun Sep 22 04:05:43:958 2019


DpHdlDeadWp: W16 (pid=12996) terminated automatically

Sun Sep 22 04:06:09:464 2019


DpWpDynCreate: created new work process W19-31312

Sun Sep 22 04:07:25:217 2019


DpWpDynCreate: created new work process W20-32398

Sun Sep 22 04:07:29:788 2019


DpWpDynCreate: created new work process W17-482

Sun Sep 22 04:07:43:608 2019


*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T65_U10696 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T65_U10696_M0 |001|EXT_MKARIM |SST-LAP-HP0002 |03:46:38|5 |SAPLSDBACCMS |high| | |DBACOCKPIT|
DpHdlSoftCancel: cancel request for T65_U10696_M0 received from DISP (reason=DP_SOFTCANCEL_SAP_GUI_DISCONNECT)

Sun Sep 22 04:11:23:615 2019


DpWpCheck: dyn W19, pid 31312 no longer needed, terminate now

Sun Sep 22 04:11:24:248 2019


DpHdlDeadWp: W19 (pid=31312) terminated automatically

Sun Sep 22 04:12:34:543 2019


DpWpCheck: dyn W17, pid 482 no longer needed, terminate now
DpHdlDeadWp: W20 (pid=32398) terminated automatically

Sun Sep 22 04:12:35:660 2019


DpHdlDeadWp: W17 (pid=482) terminated automatically

Sun Sep 22 04:13:35:645 2019


DpWpDynCreate: created new work process W18-12765

Sun Sep 22 04:18:09:024 2019


DpWpDynCreate: created new work process W16-14086

Sun Sep 22 04:18:43:629 2019


DpWpCheck: dyn W18, pid 12765 no longer needed, terminate now

Sun Sep 22 04:18:44:072 2019


DpHdlDeadWp: W18 (pid=12765) terminated automatically

Sun Sep 22 04:18:51:265 2019


DpWpDynCreate: created new work process W19-14653

Sun Sep 22 04:23:23:638 2019


DpWpCheck: dyn W16, pid 14086 no longer needed, terminate now

Sun Sep 22 04:23:24:292 2019


DpHdlDeadWp: W16 (pid=14086) terminated automatically

Sun Sep 22 04:23:53:411 2019


DpHdlDeadWp: W19 (pid=14653) terminated automatically

Sun Sep 22 04:25:04:197 2019


DpWpDynCreate: created new work process W20-16417

Sun Sep 22 04:29:07:153 2019


DpWpDynCreate: created new work process W17-17739

Sun Sep 22 04:30:23:648 2019


DpWpCheck: dyn W20, pid 16417 no longer needed, terminate now

Sun Sep 22 04:30:25:048 2019


DpHdlDeadWp: W20 (pid=16417) terminated automatically

Sun Sep 22 04:30:28:023 2019


*** ERROR => DpHdlDeadWp: W6 (pid 32125) died (severity=0, status=9) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=32125) killed with signal 9
DpWpRecoverMutex: recover resources of W6 (pid = 32125)
********** SERVER SNAPSHOT 179 (Reason: Workprocess 6 died / Time: Sun Sep 22 04:30:28 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 04:30:28 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 8, standby_wps 0
#dia = 8
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 7
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 1
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 6
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 1
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 6
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 5
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 04:30:28 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 4 (peak 291, writeCount 24103036, readCount 24103032)


UPD : 0 (peak 31, writeCount 4952, readCount 4952)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125362, readCount 2125362)
SPO : 0 (peak 2, writeCount 25064, readCount 25064)
UP2 : 0 (peak 1, writeCount 2338, readCount 2338)
DISP: 0 (peak 67, writeCount 889413, readCount 889413)
GW : 0 (peak 49, writeCount 22410999, readCount 22410999)
ICM : 0 (peak 186, writeCount 391001, readCount 391001)
LWP : 1 (peak 16, writeCount 38169, readCount 38168)
Session queue dump (high priority, 0 elements, peak 39):
Session queue dump (normal priority, 0 elements, peak 85):
Session queue dump (low priority, 0 elements, peak 25):

Requests in queue <W2> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T124_U19927_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T30_U25456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T57_U19917_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T7_U25457_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 04:30:28 2019


------------------------------------------------------------

Current snapshot id: 179


DB clean time (in percent of total time) : 24.42 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10225|SAPMSSY1 |001|SM_EFWK | | |
| 5|8468 |DIA |WP_RUN | | |high|T127_U9151_M0 |HTTP_NORM| | | 1| |001|EXT_SCHAITAN| | |
| 6| |DIA |WP_KILL| |1 | | | | | | | | | | | |
| 7|8188 |DIA |WP_RUN | | |norm|T124_U19927_M0 |ASYNC_RFC| | | 0| |000|SAP_WSRT | | |
| 9|9709 |BTC |WP_RUN | | |low |T22_U19907_M0 |BATCH | | | 10| |001|BATCHUSER |REPLOAD | |
| 10|7133 |BTC |WP_RUN | | |low |T65_U19742_M0 |BATCH | | | 82|/DYNAM/CL_REMOTE_TABLE========CP |001|BATCHUSER | | |
| 11|4312 |BTC |WP_RUN | | |low |T44_U19719_M0 |BATCH | | | 1| |001|BATCHUSER | | |
| 12|28810 |BTC |WP_RUN | |215|low |T2_U19914_M0 |BATCH | | | 13|/DYNAM/CL_DATASOURCE_ABAP=====CP |001|BATCHUSER | | |
| 13|18331 |BTC |WP_RUN | | |low |T142_U19889_M0 |BATCH | | | 4|/DYNAM/DATA_LOAD_ABAP_J1 |001|BATCHUSER | | |

Found 9 active workprocesses


Total number of workprocesses is 17

Session Table Sun Sep 22 04:30:28 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|BATCH |T2_U19914_M0 |001|BATCHUSER | |04:30:15|12 |/DYNAM/DATA_LOAD_ABAP_J1 |low | | | | 12426|
|SYNC_RFC |T6_U19420_M0 |001|SAPJSF |smprd02.niladv.org |04:29:49|7 |SAPMSSY1 |norm| | | | 4246|
|RFC_UI |T7_U25457_M0 |001|EXT_SCHAITAN| |04:30:00|17 |SAPMSSY1 |high|1 | | | 4237|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |SAPMSSY1 |low | | | | 4215|
|BGRFC_SCHEDU|T21_U18237_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:21|17 |SAPMSSY1 |high| | | | 4234|
|BATCH |T22_U19907_M0 |001|BATCHUSER | |04:30:18|9 |/DYNAM/DATA_LOAD_ABAP_J1 |low | | | | 12426|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |SAPMSSY1 |low |1 | | | 4237|
|BATCH |T44_U19719_M0 |001|BATCHUSER | |04:30:27|11 |/DYNAM/DATA_LOAD_ABAP_J1 |low | | | | 12426|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |SAPMSSY1 |low | | | | 8342|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |SAPMSSY1 |norm|1 | | | 4246|
|BATCH |T65_U19742_M0 |001|BATCHUSER | |04:29:06|10 |/DYNAM/DATA_LOAD_ABAP_J1 |low | | | | 12426|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |SAPMSSY1 |norm| | | | 4203|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4247|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |SAPMSSY1 |norm| | | | 4233|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |SAPMSSY1 |norm| | | | 4248|
|GUI |T109_U5012_M0 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:07:48|17 |SAPLSDBACCMS |high| | |DBACOCKPIT| 24732|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:30:22|5 |SBAL_DELETE |high| | |SA38 | 12448|
|BGRFC_SCHEDU|T116_U18235_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:28:05|1 |SAPMSSY1 |high| | | | 4247|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm| | | | 4233|
|SYNC_RFC |T122_U22843_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:30:15|7 |SAPMSSY1 |norm| | | | 8330|
|ASYNC_RFC |T124_U19927_M0 |000|SAP_WSRT |smprd02.niladv.org |04:30:28|7 |SAPMSSY1 |norm|1 | | | 4237|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |SAPMSSY1 |norm| | | | 4249|
|HTTP_NORMAL |T127_U9151_M0 |001|EXT_SCHAITAN|10.1.88.10 |04:30:27|5 |SAPMHTTP |high| | | | 127516|
|BATCH |T142_U19889_M0 |001|BATCHUSER | |04:30:24|13 |/DYNAM/DATA_LOAD_ABAP_J1 |low | | | | 12426|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4246|

Found 24 logons with 25 sessions


Total ES (gross) memory of all sessions: 299 MB
Most ES (gross) memory allocated by T127_U9151_M0: 124 MB

RFC-Connection Table (22 entries) Sun Sep 22 04:30:28 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52269707|52269707CU19742_M0 |T65_U19742_M0_I2|ALLOCATED |CLIENT|NO_REQUEST| 10|Sun Sep 22 04:29:06 2019 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 49|52442513|52442513CU19914_M0 |T2_U19914_M0_I2 |ALLOCATED |CLIENT|NO_REQUEST| 12|Sun Sep 22 04:30:15 2019 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 78|00115292|00115292SU22843_M0 |T122_U22843_M0_I|ALLOCATED |SERVER|RECEIVE | 7|Sun Sep 22 04:30:15 2019 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 94|52423185|52423185CU19889_M0 |T142_U19889_M0_I|ALLOCATED |CLIENT|NO_REQUEST| 13|Sun Sep 22 04:30:24 2019 |
| 114|52254546|52254546CU19719_M0 |T44_U19719_M0_I2|ALLOCATED |CLIENT|NO_REQUEST| 11|Sun Sep 22 04:30:11 2019 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 127|52439397|52439397CU19907_M0 |T22_U19907_M0_I2|ALLOCATED |CLIENT|NO_REQUEST| 9|Sun Sep 22 04:30:18 2019 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|SAP_SEND | 0|Sun Sep 22 04:27:16 2019 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|SAP_SEND | 4|Sun Sep 22 04:29:08 2019 |
| 210|42660276|42660276SU19420_M0 |T6_U19420_M0_I0 |ALLOCATED |SERVER|SAP_SEND | 7|Sun Sep 22 04:29:49 2019 |
| 219|00116617|00116617CU22843_M0 |T122_U22843_M0_I|ALLOCATED |CLIENT|NO_REQUEST| 3|Sat Sep 21 17:47:12 2019 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|SAP_SEND | 1|Sun Sep 22 04:27:05 2019 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 285|52455913|52455913SU19927_M0 |T124_U19927_M0_I|ALLOCATED |SERVER|NO_REQUEST| 6|Sun Sep 22 04:29:59 2019 |
| 287|36067373|36067373SU25457_M0 |T7_U25457_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 00:00:04 2019 |

Found 22 RFC-Connections

CA Blocks
------------------------------------------------------------
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 WORKER 7133
336 WORKER 4312
337 INVALID -1
338 WORKER 7133
339 INVALID -1
340 WORKER 28810
341 WORKER 18331
370 WORKER 9709
11 ca_blk slots of 6000 in use, 4 currently unowned (in request queues)

MPI Info Sun Sep 22 04:30:28 2019


------------------------------------------------------------
Current pipes in use: 3
Current / maximal blocks in use: 1 / 1884

Periodic Tasks Sun Sep 22 04:30:28 2019


------------------------------------------------------------
|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      5694|        92|                    |          |
|       1|DDLOG               |      5694|        92|                    |          |
|       2|BTCSCHED            |     11389|        36|                    |          |
|       3|RESTART_ALL         |      2278|       268|                    |          |
|       4|ENVCHECK            |     34170|        15|                    |          |
|       5|AUTOABAP            |      2278|       268|                    |          |
|       6|BGRFC_WATCHDOG      |      2279|       268|                    |          |
|       7|AUTOTH              |       271|        36|                    |          |
|       8|AUTOCCMS            |     11389|        36|                    |          |
|       9|AUTOSECURITY        |     11389|        36|                    |          |
|      10|LOAD_CALCULATION    |    682468|         0|                    |          |
|      11|SPOOLALRM           |     11390|        36|                    |          |
|      12|TIMEOUT             |         0|        32|T30_U25456_M0       |  64717436|
|      13|TIMEOUT             |         0|        32|T7_U25457_M0        |  64717301|
|      14|TIMEOUT             |         0|        29|T57_U19917_M0       |  64716256|
|      15|CALL_DELAYED        |         0|        89|                    |          |
|      16|TIMEOUT             |         0|       233|T21_U18237_M0       |  64707280|
|      17|TIMEOUT             |         0|       157|T116_U18235_M0      |  64695164|

Found 18 periodic tasks

********** SERVER SNAPSHOT 179 (Reason: Workprocess 6 died / Time: Sun Sep 22 04:30:28 2019) - end **********

Sun Sep 22 04:30:28:538 2019


***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]
DpWpDynCreate: created new work process W6-19703

Sun Sep 22 04:30:30:663 2019


*** ERROR => DpHdlDeadWp: W6 (pid 19703) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19703) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 19703)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
Sun Sep 22 04:30:42:432 2019
DpHdlSoftCancel: cancel request for T59_U19947_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:30:42:904 2019


DpHdlSoftCancel: cancel request for T67_U19949_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:30:43:284 2019


DpHdlSoftCancel: cancel request for T4_U19950_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:30:43:648 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:30:47:918 2019


DpHdlDeadWp: W1 (pid=23287) terminated automatically
DpWpDynCreate: created new work process W1-19786
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:30:48:991 2019


*** ERROR => DpHdlDeadWp: W1 (pid 19786) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19786) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 19786)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:30:49:993 2019


DpSendLoadInfo: quota for load / queue fill level = 6.300000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 6.443251 / 0.128571

Sun Sep 22 04:30:57:851 2019


DpHdlSoftCancel: cancel request for T145_U19951_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T153_U19952_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:31:03:648 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:31:07:861 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:31:08:254 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:31:12:347 2019


DpHdlSoftCancel: cancel request for T109_U5012_M1 received from TERMINAL (reason=DP_SOFTCANCEL_ABORT_PROGRAM)

Sun Sep 22 04:31:17:957 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpHdlDeadWp: W5 (pid=8468) terminated automatically
DpWpDynCreate: created new work process W5-19970
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:31:18:277 2019


DpHdlSoftCancel: cancel request for T55_U19953_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:31:19:363 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 19970) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19970) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 19970)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:31:20:811 2019


DpHdlSoftCancel: cancel request for T134_U19954_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:31:23:774 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpHdlDeadWp: W7 (pid=8188) terminated automatically
DpWpDynCreate: created new work process W7-19973
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:31:25:442 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 19973) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=19973) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 19973)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:31:28:539 2019


DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 19975
Sun Sep 22 04:31:37:982 2019
DpHdlSoftCancel: cancel request for T2_U19957_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:31:43:774 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:31:58:148 2019


DpHdlSoftCancel: cancel request for T142_U19959_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:32:00:880 2019


DpHdlSoftCancel: cancel request for T22_U19960_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:32:03:775 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:32:04:131 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:32:06:770 2019


DpHdlDeadWp: W0 (pid=31517) terminated automatically
DpWpDynCreate: created new work process W0-20739
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
Sun Sep 22 04:32:08:597 2019
*** ERROR => DpHdlDeadWp: W0 (pid 20739) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=20739) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 20739)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:32:23:775 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 19975 terminated

Sun Sep 22 04:32:43:776 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:32:55:457 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
Sun Sep 22 04:32:58:191 2019
DpHdlSoftCancel: cancel request for T65_U19963_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T67_U19964_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:03:196 2019


DpHdlSoftCancel: cancel request for T52_U19972_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T54_U19975_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T49_U19974_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T119_U19971_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T105_U19973_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T80_U19969_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T127_U19966_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T44_U19965_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T113_U19967_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T68_U19968_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:03:776 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:33:04:132 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:33:08:201 2019


DpHdlSoftCancel: cancel request for T114_U20014_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:13:206 2019


DpHdlSoftCancel: cancel request for T122_U19979_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:23:777 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:33:28:100 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:33:28:221 2019


DpHdlSoftCancel: cancel request for T7_U19982_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:33:222 2019


DpHdlSoftCancel: cancel request for T82_U20013_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T153_U19985_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T17_U19983_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T107_U19987_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T145_U19989_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T6_U19988_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:38:227 2019


DpHdlSoftCancel: cancel request for T4_U19990_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:43:232 2019


DpHdlSoftCancel: cancel request for T14_U19991_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:33:43:777 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:33:53:257 2019


DpHdlSoftCancel: cancel request for T124_U19994_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:34:03:778 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:34:04:132 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:34:18:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpHdlDeadWp: W17 (pid=17739) terminated automatically

Sun Sep 22 04:34:23:779 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:34:43:779 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:34:53:553 2019


DpHdlSoftCancel: cancel request for T39_U20016_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T8_U20017_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:34:56:900 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.936211 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.935940 / 0.000000

Sun Sep 22 04:34:59:421 2019


DpHdlSoftCancel: cancel request for T83_U20022_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T46_U20023_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T18_U20020_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:35:00:901 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 04:35:03:780 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpWpDynCreate: created new work process W0-21455
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpWpDynCreate: created new work process W1-21456
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpWpDynCreate: created new work process W5-21457
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpWpDynCreate: created new work process W6-21458
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
DpWpDynCreate: created new work process W7-21459
DpSendLoadInfo: queue DIA no longer with high load

Sun Sep 22 04:35:04:133 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:35:04:425 2019


DpHdlSoftCancel: cancel request for T110_U20031_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T99_U20032_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T34_U20024_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T56_U20025_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T13_U20026_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T63_U20029_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T60_U20028_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T66_U20030_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:35:04:931 2019


*** ERROR => DpHdlDeadWp: W0 (pid 21455) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21455) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 21455)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 21456) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21456) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 21456)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 21457) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21457) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 21457)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 21458) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21458) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 21458)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 21459) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=21459) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 21459)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:35:05:939 2019


DpSendLoadInfo: quota for load / queue fill level = 2.700000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 2.999888 / 1.021429

Sun Sep 22 04:35:08:189 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:35:17:535 2019


DpHdlSoftCancel: cancel request for T81_U20034_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:35:23:780 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:35:28:207 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:35:33:546 2019


DpHdlSoftCancel: cancel request for T2_U20040_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T94_U20042_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T101_U20038_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T27_U20043_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T116_U20039_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T5_U20044_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:35:38:549 2019


DpHdlSoftCancel: cancel request for T62_U20045_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:35:42:420 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.949987 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.949642 / 0.000000

Sun Sep 22 04:35:43:554 2019


DpHdlSoftCancel: cancel request for T134_U20046_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:35:43:781 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:35:46:971 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 04:35:53:564 2019


DpHdlSoftCancel: cancel request for T7_U20048_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:36:03:782 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpWpCheck: signal handling in W1 timed out, kill now [dpxxwp.c 809]
***LOG Q02=> DpWpCheck, SigH Timeout (Workp. 1 -1) [dpxxwp.c 812]
DpWpRecoverMutex: recover resources of W1 (pid = 4294967295)
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:36:04:133 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:36:23:782 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpWpCheck: signal handling in W5 timed out, kill now [dpxxwp.c 809]
***LOG Q02=> DpWpCheck, SigH Timeout (Workp. 5 -1) [dpxxwp.c 812]
DpWpRecoverMutex: recover resources of W5 (pid = 4294967295)
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:31:28 2019, skip new snapshot

Sun Sep 22 04:36:43:783 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 180 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:36:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 04:36:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 1
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 04:36:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 182 (peak 291, writeCount 24103369, readCount 24103187)


UPD : 0 (peak 31, writeCount 4954, readCount 4954)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125378, readCount 2125378)
SPO : 0 (peak 2, writeCount 25073, readCount 25073)
UP2 : 0 (peak 1, writeCount 2340, readCount 2340)
DISP: 0 (peak 67, writeCount 889520, readCount 889520)
GW : 0 (peak 49, writeCount 22411016, readCount 22411016)
ICM : 1 (peak 186, writeCount 391018, readCount 391017)
LWP : 1 (peak 16, writeCount 38192, readCount 38191)

Session queue dump (high priority, 1 elements, peak 39):


-1 <- 26 < EmbryoQueue_DIA> -> -1
Session queue dump (normal priority, 76 elements, peak 85):
-1 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 37
155 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 72
101 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 115
43 <- 115 < T46_U20023_M0> -> 68
115 <- 68 < T83_U20022_M0> -> 92
68 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 103
156 <- 103 < T145_U19989_M0> -> 32
103 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 113
109 <- 113 < T82_U20013_M0> -> 66
113 <- 66 < T114_U20014_M0> -> 36
66 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 50
98 <- 50 < T80_U19969_M0> -> 79
50 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 145
134 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 120
73 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 142
185 <- 142 < T48_U20071_M0> -> 64
142 <- 64 < T111_U20073_M0> -> 85
64 <- 85 < T117_U20074_M0> -> 47
85 <- 47 < T4_U20075_M0> -> 130
47 <- 130 < T43_U20076_M0> -> 172
130 <- 172 < T85_U20077_M0> -> 108
172 <- 108 < T6_U20079_M0> -> 65
108 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 169
53 <- 169 < T28_U20082_M0> -> 70
169 <- 70 < T108_U20083_M0> -> 31
70 <- 31 < T138_U20086_M0> -> 132
31 <- 132 < T140_U20098_M0> -> 52
132 <- 52 < T26_U20099_M0> -> 63
52 <- 63 < T41_U20100_M0> -> 119
63 <- 119 < T11_U20102_M0> -> 77
119 <- 77 < T0_U20103_M0> -> 86
77 <- 86 < T32_U20104_M0> -> 163
86 <- 163 < T133_U20105_M0> -> 61
163 <- 61 < T104_U20107_M0> -> 187
61 <- 187 < T112_U20108_M0> -> 93
187 <- 93 < T70_U20109_M0> -> 99
93 <- 99 < T139_U20112_M0> -> 69
99 <- 69 < T61_U20113_M0> -> 170
69 <- 170 < T93_U20114_M0> -> 55
170 <- 55 < T96_U20125_M0> -> -1
Session queue dump (low priority, 1 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> -1

Requests in queue <IcmanQueue> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <EmbryoQueue_DIA> (4 requests):
- 1 requests for handler REQ_HANDLER_DDLOG
- 1 requests for handler REQ_HANDLER_AUTOTH
- 1 requests for handler REQ_HANDLER_BTCSCHED
- 1 requests for handler REQ_HANDLER_AUTOSECURITY
Requests in queue <T138_U20086_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T107_U19987_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T18_U20020_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T116_U20039_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T26_U20099_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T124_U20081_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T41_U20100_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T111_U20073_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T29_U20080_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T114_U20014_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (8 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T108_U20083_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T99_U20032_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T105_U19973_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T113_U19967_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T32_U20104_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T54_U19975_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T127_U19966_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T34_U20024_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U19989_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T49_U19974_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T153_U19985_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20102_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T140_U20098_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T52_U19972_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T17_U19983_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20105_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T28_U20082_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T93_U20114_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T85_U20077_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T60_U20028_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20108_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 04:36:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 180
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |2 |norm|T22_U19960_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |
| 1| |DIA |WP_KILL| |208|norm|T59_U19947_M0 |HTTP_NORM| | | |SAPMHTTP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10262|SAPMSSY1 |001|SM_EFWK | | |
| 3|19909 |DIA |WP_RUN | | |high|T21_U20124_M0 |INTERNAL | | | 8| | | | | |
| 4|19804 |DIA |WP_RUN | | |high|T14_U20122_M0 |INTERNAL | | | 5| | | | | |
| 5| |DIA |WP_KILL| |2 |high|T109_U5012_M1 |GUI | | | |SBAL_DELETE |001|EXT_SCHAITAN| | |
| 6| |DIA |WP_KILL| |3 | | | | | | | | | | | |
| 7| |DIA |WP_KILL| |2 |norm|T55_U19953_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |

Found 8 active workprocesses


Total number of workprocesses is 16

Session Table Sun Sep 22 04:36:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| | |norm|1 | | | 0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| | |norm|2 | | | 0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| | |norm|4 | | | 0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| | |norm|1 | | | 0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| | |norm|2 | | | 0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |SAPMSSY1 |low | | | | 4215|
|HTTP_NORMAL |T11_U20102_M0 | | |10.54.36.11 |04:35:32| | |norm|1 | | | 0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| | |norm|5 | | | 0|
|INTERNAL |T14_U20122_M0 | | | |04:36:38|4 | |high| | | | 4200|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| | |norm|2 | | | 0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| | |norm|2 | | | 0|
|INTERNAL |T21_U20124_M0 | | | |04:36:35|3 | |high| | | | 0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |SAPMHTTP |norm|1 | | | 4590|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| | |norm|4 | | | 0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| | |norm|1 | | | 0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| | |norm|1 | | | 0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |SAPMSSY1 |low |2 | | | 4237|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| | |norm|1 | | | 0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| | |norm|6 | | | 0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| | |norm|1 | | | 0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| | |norm|1 | | | 0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| | |norm|2 | | | 0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |SAPMSSY1 |low | | | | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| | |norm|2 | | | 0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| | |norm|4 | | | 0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |SAPMSSY1 |norm|2 | | | 4246|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| | |norm|8 | | | 0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| | |norm|2 | | | 0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| | |norm|2 | | | 0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| | |norm|2 | | | 0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| | |norm|4 | | | 0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |SAPMSSY1 |norm| | | | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| | |norm|1 | | | 0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4247|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| | |norm|2 | | | 0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| | |norm|1 | | | 0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| | |norm|1 | | | 0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |SAPMSSY1 |norm| | | | 4233|
|HTTP_NORMAL |T93_U20114_M0 | | |10.54.36.27 |04:35:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| | |norm|2 | | | 0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| | |norm|1 | | | 0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| | |norm|1 | | | 0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |SAPMSSY1 |norm| | | | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| | |norm|2 | | | 0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| | |norm|1 | | | 0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| | |norm|5 | | | 0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T108_U20083_M0 | | |10.54.36.38 |04:35:03| | |norm|1 | | | 0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 | |SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| | |norm|1 | | | 0|
|HTTP_NORMAL |T112_U20108_M0 | | |10.54.36.29 |04:35:37| | |norm|9 | | | 0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| | |norm|4 | | | 0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| | |norm|2 | | | 0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| | |norm|1 | | | 0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm|1 | | | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| | |norm|3 | | | 0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| | |norm|1 | | | 0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |SAPMSSY1 |norm| | | | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| | |norm|5 | | | 0|
|HTTP_NORMAL |T133_U20105_M0 | | |10.54.36.14 |04:35:35| | |norm|1 | | | 0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| | |norm|2 | | | 0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| | |norm|2 | | | 0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| | |norm|1 | | | 0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T145_U19989_M0 | | |10.54.36.28 |04:31:32| | |norm|5 | | | 0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4246|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| | |norm|4 | | | 0|

Found 91 logons with 91 sessions


Total ES (gross) memory of all sessions: 79 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

RFC-Connection Table (22 entries) Sun Sep 22 04:36:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|SAP_SEND | 0|Sun Sep 22 04:27:16 2019 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|SAP_SEND | 4|Sun Sep 22 04:29:08 2019 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|SAP_SEND | 1|Sun Sep 22 04:27:05 2019 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
Found 22 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
15 ca_blk slots of 6000 in use, 13 currently unowned (in request queues)

MPI Info Sun Sep 22 04:36:43 2019


------------------------------------------------------------
Current pipes in use: 129
Current / maximal blocks in use: 118 / 1884

Periodic Tasks Sun Sep 22 04:36:43 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5697| 77| | |
| 1|DDLOG | 5697| 77| | |
| 2|BTCSCHED | 11394| 21| | |
| 3|RESTART_ALL | 2279| 193| | |
| 4|ENVCHECK | 34189| 20| | |
| 5|AUTOABAP | 2279| 193| | |
| 6|BGRFC_WATCHDOG | 2280| 193| | |
| 7|AUTOTH | 277| 21| | |
| 8|AUTOCCMS | 11394| 21| | |
| 9|AUTOSECURITY | 11394| 21| | |
| 10|LOAD_CALCULATION | 682840| 1| | |
| 11|SPOOLALRM | 11396| 21| | |
| 12|CALL_DELAYED | 0| 864| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 180 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:36:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]

Sun Sep 22 04:36:43:792 2019


DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpWpCheck: signal handling in W7 timed out, kill now [dpxxwp.c 809]
***LOG Q02=> DpWpCheck, SigH Timeout (Workp. 7 -1) [dpxxwp.c 812]
DpWpRecoverMutex: recover resources of W7 (pid = 4294967295)

Sun Sep 22 04:36:55:706 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:36:59:513 2019


DpHdlSoftCancel: cancel request for T117_U20074_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T48_U20071_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T111_U20073_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:37:03:783 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:37:04:134 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:37:04:517 2019


DpHdlSoftCancel: cancel request for T28_U20082_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T4_U20075_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T43_U20076_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T124_U20081_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T29_U20080_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T85_U20077_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T6_U20079_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T108_U20083_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:37:05:054 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Request handling without progress
*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (1. check) [dpxxwp.c 4705]

Sun Sep 22 04:37:08:292 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:37:17:662 2019


DpHdlSoftCancel: cancel request for T138_U20086_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:37:23:784 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpWpCheck: signal handling in W0 timed out, kill now [dpxxwp.c 809]
***LOG Q02=> DpWpCheck, SigH Timeout (Workp. 0 -1) [dpxxwp.c 812]
DpWpRecoverMutex: recover resources of W0 (pid = 4294967295)
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:37:33:672 2019


DpHdlSoftCancel: cancel request for T32_U20104_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T140_U20098_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T26_U20099_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T0_U20103_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T41_U20100_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T11_U20102_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:37:35:739 2019


DpHdlSoftCancel: cancel request for T133_U20105_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:37:40:743 2019


DpHdlSoftCancel: cancel request for T70_U20109_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:37:43:785 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 22395

Sun Sep 22 04:37:50:749 2019


DpHdlSoftCancel: cancel request for T61_U20113_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T93_U20114_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:38:03:785 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:38:04:134 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:38:05:761 2019


DpHdlSoftCancel: cancel request for T89_U20156_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:38:23:786 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:38:25:780 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:38:33:781 2019


DpHdlSoftCancel: cancel request for T108_U20154_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:38:35:789 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:38:43:787 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 22395 terminated

Sun Sep 22 04:38:53:793 2019


DpHdlSoftCancel: cancel request for T100_U20128_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:38:55:806 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:38:58:798 2019


DpHdlSoftCancel: cancel request for T16_U20130_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:39:03:788 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpHdlSoftCancel: cancel request for T37_U20132_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T90_U20133_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T40_U20136_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T120_U20137_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T50_U20135_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T9_U20138_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:39:04:135 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:39:13:801 2019


DpHdlSoftCancel: cancel request for T84_U20142_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:39:23:788 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:39:33:833 2019


DpHdlSoftCancel: cancel request for T1_U20146_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T64_U20148_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T149_U20149_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T14_U20145_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:39:35:859 2019


DpHdlSoftCancel: cancel request for T58_U20151_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: delete in progress for T92_U20186_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T71_U20150_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:39:43:789 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:39:48:477 2019


DpHdlSoftCancel: delete in progress for T76_U20187_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:39:55:893 2019


DpHdlSoftCancel: ignore cancel for T33_U20200_M0

Sun Sep 22 04:39:58:989 2019


DpHdlSoftCancel: cancel request for T133_U20159_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:40:03:789 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-23587
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-23588
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-23589
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-23590
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-23591

Sun Sep 22 04:40:03:993 2019


DpHdlSoftCancel: cancel request for T91_U20163_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T15_U20162_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T88_U20160_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:40:04:136 2019


DpSendLoadInfo: queue DIA no longer with high load
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:40:05:464 2019


*** ERROR => DpHdlDeadWp: W0 (pid 23587) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23587) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 23587)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 23588) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23588) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 23588)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 23589) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23589) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 23589)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 23590) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23590) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 23590)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 23591) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=23591) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 23591)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:40:06:465 2019


DpSendLoadInfo: quota for load / queue fill level = 2.700000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 2.999995 / 2.178571

Sun Sep 22 04:40:08:519 2019


DpHdlSoftCancel: delete in progress for T19_U20189_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:40:15:909 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:40:23:790 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:40:28:294 2019


DpHdlSoftCancel: cancel request for T31_U20168_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:40:38:301 2019


DpHdlSoftCancel: cancel request for T112_U20108_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:40:43:301 2019


DpHdlSoftCancel: cancel request for T77_U20172_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:40:43:790 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:40:53:309 2019


DpHdlSoftCancel: cancel request for T23_U20175_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T53_U20177_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T73_U20178_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:41:03:314 2019


DpHdlSoftCancel: cancel request for T129_U20184_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T93_U20181_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T115_U20182_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:41:03:790 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:41:08:319 2019


DpHdlSoftCancel: cancel request for T123_U20185_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:41:17:518 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.946851 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.946752 / 0.000000

Sun Sep 22 04:41:18:579 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:41:21:522 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 04:41:23:791 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:41:35:974 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:41:38:338 2019


DpHdlSoftCancel: cancel request for T21_U20194_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T92_U20196_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T100_U20195_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:41:43:791 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:41:58:357 2019


DpHdlSoftCancel: cancel request for T76_U20199_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:42:03:362 2019


DpHdlSoftCancel: cancel request for T35_U20203_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T25_U20202_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:42:03:791 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:42:04:137 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:42:13:368 2019


DpHdlSoftCancel: cancel request for T42_U20205_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:42:23:792 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:37:43 2019, skip new snapshot

Sun Sep 22 04:42:33:013 2019


DpHdlSoftCancel: cancel request for T102_U20210_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T79_U20212_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:42:38:015 2019


DpHdlSoftCancel: cancel request for T106_U20213_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:42:43:793 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 181 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:42:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 04:42:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 1
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 04:42:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 391 (peak 393, writeCount 24103673, readCount 24103282)
UPD : 0 (peak 31, writeCount 4955, readCount 4955)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125383, readCount 2125383)
SPO : 0 (peak 2, writeCount 25080, readCount 25080)
UP2 : 0 (peak 1, writeCount 2341, readCount 2341)
DISP: 0 (peak 67, writeCount 889626, readCount 889626)
GW : 0 (peak 49, writeCount 22411017, readCount 22411017)
ICM : 0 (peak 186, writeCount 391032, readCount 391032)
LWP : 6 (peak 16, writeCount 38206, readCount 38200)

Session queue dump (high priority, 1 elements, peak 39):
-1 <- 26 < EmbryoQueue_DIA> -> -1
Session queue dump (normal priority, 149 elements, peak 149):
-1 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 59
74 <- 59 < T25_U20202_M0> -> 56
59 <- 56 < T35_U20203_M0> -> 45
56 <- 45 < T76_U20199_M0> -> 32
45 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 39
109 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 79
171 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 148
73 <- 148 < T123_U20185_M0> -> 50
148 <- 50 < T80_U19969_M0> -> 147
50 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 153
170 <- 153 < T129_U20184_M0> -> 129
153 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 54
144 <- 54 < T23_U20175_M0> -> 58
54 <- 58 < T77_U20172_M0> -> 187
58 <- 187 < T112_U20108_M0> -> 117
187 <- 117 < T31_U20168_M0> -> 78
117 <- 78 < T88_U20160_M0> -> 60
78 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 135
163 <- 135 < T71_U20150_M0> -> 118
135 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 70
167 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 69
114 <- 69 < T61_U20113_M0> -> 93
69 <- 93 < T70_U20109_M0> -> 119
93 <- 119 < T11_U20102_M0> -> 63
119 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 86
132 <- 86 < T32_U20104_M0> -> 31
86 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 37
155 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 72
101 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 115
43 <- 115 < T46_U20023_M0> -> 68
115 <- 68 < T83_U20022_M0> -> 92
68 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 113
156 <- 113 < T82_U20013_M0> -> 66
113 <- 66 < T114_U20014_M0> -> 120
66 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 100
40 <- 100 < T38_U20216_M0> -> 71
100 <- 71 < T36_U20219_M0> -> 89
71 <- 89 < T12_U20220_M0> -> 76
89 <- 76 < T20_U20221_M0> -> 166
76 <- 166 < T156_U20224_M0> -> 146
166 <- 146 < T132_U20225_M0> -> 62
146 <- 62 < T121_U20226_M0> -> 188
62 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 152
160 <- 152 < T78_U20239_M0> -> 183
152 <- 183 < T152_U20240_M0> -> 175
183 <- 175 < T144_U20241_M0> -> 165
175 <- 165 < T75_U20246_M0> -> 180
165 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 168
154 <- 168 < T146_U20253_M0> -> 174
168 <- 174 < T131_U20254_M0> -> 184
174 <- 184 < T135_U20255_M0> -> 128
184 <- 128 < T155_U20258_M0> -> 190
128 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 75
158 <- 75 < T33_U20262_M0> -> 176
75 <- 176 < T157_U20264_M0> -> 182
176 <- 182 < T151_U20265_M0> -> 178
182 <- 178 < T130_U20267_M0> -> 189
178 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 192
191 <- 192 < T161_U20275_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1

Requests in queue <W0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W5> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W6> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W7> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <EmbryoQueue_DIA> (2 requests):
- 1 requests for handler REQ_HANDLER_BTCSCHED
- 1 requests for handler REQ_HANDLER_AUTOSECURITY
Requests in queue <T138_U20086_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (8 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T41_U20100_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T0_U20103_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (20 requests):
- 19 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (8 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T8_U20017_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T34_U20024_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20102_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T140_U20098_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20150_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T115_U20182_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T129_U20184_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T81_U20034_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T50_U20135_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20246_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T156_U20224_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T16_U20130_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20254_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T144_U20241_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T157_U20264_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T90_U20133_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T152_U20240_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T135_U20255_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20108_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T125_U20235_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T158_U20273_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T159_U20259_M0> (5 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T160_U20274_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T161_U20275_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 04:42:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 181
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |3 |norm|T22_U19960_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |
| 1| |DIA |WP_KILL| |209|norm|T59_U19947_M0 |HTTP_NORM| | | |SAPMHTTP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10298|SAPMSSY1 |001|SM_EFWK | | |
| 3|19909 |DIA |WP_RUN | | |high|T145_U20276_M0 |INTERNAL | | | 7| | | | | |
| 4|19804 |DIA |WP_RUN | | |high|T19_U20271_M0 |INTERNAL | | | 5| |000|SAPSYS | | |
| 5| |DIA |WP_KILL| |3 |high|T109_U5012_M1 |GUI | | | |SBAL_DELETE |001|EXT_SCHAITAN| | |
| 6| |DIA |WP_KILL| |4 | | | | | | | | | | | |
| 7| |DIA |WP_KILL| |3 |norm|T55_U19953_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |

Found 8 active workprocesses


Total number of workprocesses is 16

Session Table Sun Sep 22 04:42:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| |
|norm|2 | | |
0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| |
|norm|5 | | |
0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| |
|norm|2 | | |
0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |
SAPMSSY1 |low | |
| | 4215|
|HTTP_NORMAL |T11_U20102_M0 | | |10.54.36.11 |04:35:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| |
|norm|8 | | |
0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| |
|norm|2 | | |
0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| |
|norm|2 | | |
0|
|INTERNAL |T19_U20271_M0 |000|SAPSYS | |04:42:38|4 |
|high| | | |
4200|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| |
|norm|9 | | |
0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| |
|norm|5 | | |
0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| |
|norm|2 | | |
0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |
SAPMSSY1 |low |2 |
| | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| |
|norm|2 | | |
0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| |
|norm|2 | | |
0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| |
|norm|2 | | |
0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| |
|norm|2 | | |
0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| |
|norm|9 | | |
0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| |
|norm|6 | | |
0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| |
|norm|2 | | |
0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| |
|norm|3 | | |
0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| |
|norm|2 | | |
0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |
SAPMSSY1 |low | |
| | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| |
|norm|2 | | |
0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| |
|norm|4 | | |
0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |
SAPMSSY1 |norm|2 |
| | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| |
|norm|2 | | |
0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| |
|norm|9 | | |
0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| |
|norm|2 | | |
0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |
SAPMSSY1 |norm| |
| | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| |
|norm|2 | | |
0|
|HTTP_NORMAL |T71_U20150_M0 | | |10.54.36.28 |04:37:34| |
|norm|3 | | |
0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| |
|norm|2 | | |
0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T75_U20246_M0 | | |10.54.36.35 |04:41:27| |
|norm|1 | | |
0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| |
|norm|2 | | |
0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| |
|norm|2 | | |
0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| |
|norm|2 | | |
0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| |
|norm|2 | | |
0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| |
|norm|1 | | |
0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| |
|norm|20 | | |
0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| |
|norm|2 | | |
0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| |
|norm|2 | | |
0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| |
|norm|2 | | |
0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| |
|norm|1 | | |
0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| |
|norm|1 | | |
0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| |
|norm|1 | | |
0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |
SAPMSSY1 |norm|1 |
| | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| |
|norm|2 | | |
0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| |
|norm|2 | | |
0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| |
|low |1 | | |
0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| | |norm|2 | | | 0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| | |norm|2 | | | 0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 | |SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T112_U20108_M0 | | |10.54.36.29 |04:35:37| | |norm|10 | | | 0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| | |norm|5 | | | 0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| | |norm|2 | | | 0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| | |norm|2 | | | 0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm|1 | | | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| | |norm|5 | | | 0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| | |norm|8 | | | 0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| | |norm|2 | | | 0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| | |norm|1 | | | 0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |SAPMSSY1 |norm|1 | | | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| | |norm|6 | | | 0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| | |norm|2 | | | 0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T131_U20254_M0 | | |10.54.36.14 |04:41:37| | |norm|1 | | | 0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| | |norm|1 | | | 0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| | |norm|2 | | | 0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| | |norm|1 | | | 0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| | |norm|3 | | | 0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| | |norm|2 | | | 0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| | |norm|1 | | | 0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| | |norm|1 | | | 0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| | |norm|1 | | | 0|
|INTERNAL |T145_U20276_M0 | | | |04:42:36|3 | |high| | | | 0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| | |norm|1 | | | 0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| | |norm|1 | | | 0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| | |norm|1 | | | 0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| | |norm|1 | | | 0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| | |norm|1 | | | 0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| | |norm|5 | | | 0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| | |norm|1 | | | 0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| | |norm|1 | | | 0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| | |norm|1 | | | 0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| | |norm|1 | | | 0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| | |norm|5 | | | 0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| | |norm|1 | | | 0|
|HTTP_NORMAL |T161_U20275_M0 | | |10.54.36.28 |04:42:34| | |norm|1 | | | 0|

Found 162 logons with 162 sessions


Total ES (gross) memory of all sessions: 79 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T88_U20160_M0


Force ABAP stack dump of session T112_U20108_M0

RFC-Connection Table (34 entries) Sun Sep 22 04:42:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 34 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
30 ca_blk slots of 6000 in use, 28 currently unowned (in request queues)

MPI Info Sun Sep 22 04:42:43 2019


------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 208 / 1884

Periodic Tasks Sun Sep 22 04:42:43 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5700| 77| | |
| 1|DDLOG | 5700| 77| | |
| 2|BTCSCHED | 11399| 21| | |
| 3|RESTART_ALL | 2280| 133| | |
| 4|ENVCHECK | 34207| 20| | |
| 5|AUTOABAP | 2280| 133| | |
| 6|BGRFC_WATCHDOG | 2281| 133| | |
| 7|AUTOTH | 282| 21| | |
| 8|AUTOCCMS | 11399| 21| | |
| 9|AUTOSECURITY | 11398| 21| | |
| 10|LOAD_CALCULATION | 683200| 1| | |
| 11|SPOOLALRM | 11402| 21| | |
| 12|CALL_DELAYED | 0| 504| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 181 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:42:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]

Sun Sep 22 04:42:43:806 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:42:48:658 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:42:53:798 2019


DpHdlSoftCancel: cancel request for T12_U20220_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T20_U20221_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:42:58:801 2019


DpHdlSoftCancel: cancel request for T156_U20224_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T132_U20225_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:02:613 2019


DpSendLoadInfo: queue DIA no longer with high load
Sun Sep 22 04:43:03:794 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpHdlSoftCancel: cancel request for T78_U20239_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T152_U20240_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T148_U20238_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T121_U20226_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T137_U20236_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T125_U20235_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:04:137 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:43:08:674 2019


DpHdlSoftCancel: cancel request for T162_U20279_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T144_U20241_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:17:631 2019


DpSendLoadInfo: quota for load / queue fill level = 2.700000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 2.749953 / 3.014286

Sun Sep 22 04:43:23:794 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:43:28:693 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:43:28:836 2019


DpHdlSoftCancel: cancel request for T75_U20246_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
Sun Sep 22 04:43:33:841 2019
DpHdlSoftCancel: cancel request for T19_U20278_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T150_U20251_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T143_U20250_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:38:844 2019


DpHdlSoftCancel: cancel request for T131_U20254_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:43:795 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 24524
DpHdlSoftCancel: cancel request for T135_U20255_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:53:857 2019


DpHdlSoftCancel: cancel request for T159_U20259_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:43:59:346 2019


DpHdlSoftCancel: cancel request for T154_U20261_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:44:03:796 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:44:04:138 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:44:07:665 2019


DpHdlSoftCancel: cancel request for T151_U20265_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T157_U20264_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:44:20:931 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:44:23:796 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 24524 terminated

Sun Sep 22 04:44:33:684 2019


DpHdlSoftCancel: cancel request for T160_U20274_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T158_U20273_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:44:38:688 2019


DpHdlSoftCancel: cancel request for T161_U20275_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:44:43:796 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:44:48:771 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:44:58:702 2019


DpHdlSoftCancel: cancel request for T145_U20282_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:02:722 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:45:03:702 2019


DpHdlSoftCancel: cancel request for T168_U20288_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T164_U20283_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T165_U20284_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-25462
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-25463
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-25464
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-25465
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-25466

Sun Sep 22 04:45:04:138 2019


DpSendLoadInfo: queue DIA no longer with high load
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:45:05:299 2019


*** ERROR => DpHdlDeadWp: W0 (pid 25462) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25462) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 25462)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 25463) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25463) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 25463)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 25464) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25464) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 25464)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 25465) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25465) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 25465)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 25466) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=25466) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 25466)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:45:06:306 2019


DpSendLoadInfo: quota for load / queue fill level = 2.700000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 2.902324 / 3.400000

Sun Sep 22 04:45:08:705 2019


DpHdlSoftCancel: cancel request for T169_U20289_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:45:16:924 2019


DpHdlSoftCancel: cancel request for T170_U20291_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:23:798 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:45:28:804 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:45:28:936 2019


DpHdlSoftCancel: cancel request for T171_U20295_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:33:939 2019


DpHdlSoftCancel: cancel request for T172_U20298_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:38:944 2019


DpHdlSoftCancel: cancel request for T173_U20299_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T38_U20216_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:43:798 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:45:43:949 2019


DpHdlSoftCancel: cancel request for T174_U20300_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:53:958 2019


DpHdlSoftCancel: cancel request for T177_U20306_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T176_U20305_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T175_U20304_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:45:55:346 2019


DpSendLoadInfo: queue DIA no longer with high load

Sun Sep 22 04:45:58:958 2019


DpHdlSoftCancel: cancel request for T179_U20309_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:46:03:799 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:46:03:961 2019


DpHdlSoftCancel: cancel request for T181_U20313_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T182_U20314_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T180_U20310_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:46:04:139 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:46:09:194 2019


DpHdlSoftCancel: cancel request for T184_U20316_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T185_U20317_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:46:10:357 2019


DpSendLoadInfo: quota for load / queue fill level = 2.700000 / 5.000000
DpSendLoadInfo: queue DIA now with high load, load / queue fill level = 2.749969 / 4.071429

Sun Sep 22 04:46:23:799 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:46:33:211 2019


DpHdlSoftCancel: cancel request for T187_U20325_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:46:38:216 2019


DpHdlSoftCancel: cancel request for T188_U20326_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:46:43:800 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:46:52:391 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.907023 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.907103 / 0.000000

Sun Sep 22 04:46:55:392 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:46:56:393 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 04:47:03:483 2019


DpHdlSoftCancel: cancel request for T189_U20331_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:47:03:801 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:47:04:140 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:47:13:484 2019


DpHdlSoftCancel: cancel request for T191_U20336_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:47:23:801 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:47:33:500 2019


DpHdlSoftCancel: cancel request for T193_U20341_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T194_U20343_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:47:38:505 2019


DpHdlSoftCancel: cancel request for T195_U20345_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:47:43:802 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:47:44:289 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:47:53:515 2019


DpHdlSoftCancel: cancel request for T161_U20349_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T190_U20350_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:47:58:519 2019


DpHdlSoftCancel: cancel request for T199_U20353_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:03:525 2019


DpHdlSoftCancel: cancel request for T200_U20354_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T202_U20357_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T201_U20355_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T204_U20359_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T203_U20358_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T205_U20360_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:03:803 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:48:04:141 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:48:09:309 2019


DpHdlSoftCancel: cancel request for T212_U20408_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T207_U20363_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T206_U20362_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:23:804 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:43:43 2019, skip new snapshot

Sun Sep 22 04:48:33:329 2019


DpHdlSoftCancel: cancel request for T209_U20370_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T208_U20368_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:38:333 2019

DpHdlSoftCancel: cancel request for T11_U20406_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T219_U20381_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T210_U20371_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:40:780 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:48:43:337 2019


DpHdlSoftCancel: cancel request for T221_U20383_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:43:805 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 182 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:48:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 04:48:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 2
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 04:48:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 659 (peak 661, writeCount 24104043, readCount 24103384)
UPD : 0 (peak 31, writeCount 4956, readCount 4956)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125388, readCount 2125388)
SPO : 0 (peak 2, writeCount 25087, readCount 25087)
UP2 : 0 (peak 1, writeCount 2342, readCount 2342)
DISP: 0 (peak 67, writeCount 889743, readCount 889743)
GW : 0 (peak 49, writeCount 22411018, readCount 22411018)
ICM : 0 (peak 186, writeCount 391049, readCount 391049)
LWP : 6 (peak 16, writeCount 38220, readCount 38214)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 223 elements, peak 224):
-1 <- 252 < T221_U20383_M0> -> 250
252 <- 250 < T219_U20381_M0> -> 119
250 <- 119 < T11_U20406_M0> -> 117
119 <- 117 < T31_U20168_M0> -> 239
117 <- 239 < T208_U20368_M0> -> 240
239 <- 240 < T209_U20370_M0> -> 60
240 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 237
124 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 242
238 <- 242 < T212_U20408_M0> -> 163
242 <- 163 < T133_U20159_M0> -> 78
163 <- 78 < T88_U20160_M0> -> 236
78 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 231
233 <- 231 < T200_U20354_M0> -> 230
231 <- 230 < T199_U20353_M0> -> 70
230 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 221
114 <- 221 < T190_U20350_M0> -> 192
221 <- 192 < T161_U20349_M0> -> 118
192 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 226
105 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 125
224 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 222
42 <- 222 < T191_U20336_M0> -> 94
222 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 220
167 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 216
218 <- 216 < T185_U20317_M0> -> 215
216 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 213
211 <- 213 < T182_U20314_M0> -> 212
213 <- 212 < T181_U20313_M0> -> 69
212 <- 69 < T61_U20113_M0> -> 210
69 <- 210 < T179_U20309_M0> -> 206
210 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 93
208 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 205
63 <- 205 < T174_U20300_M0> -> 77
205 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 100
132 <- 100 < T38_U20216_M0> -> 204
100 <- 204 < T173_U20299_M0> -> 203
204 <- 203 < T172_U20298_M0> -> 202
203 <- 202 < T171_U20295_M0> -> 31
202 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 201
65 <- 201 < T170_U20291_M0> -> 53
201 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 200
169 <- 200 < T169_U20289_M0> -> 64
200 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 196
85 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 199
195 <- 199 < T168_U20288_M0> -> 103
199 <- 103 < T145_U20282_M0> -> 189
103 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 176
191 <- 176 < T157_U20264_M0> -> 182
176 <- 182 < T151_U20265_M0> -> 151
182 <- 151 < T7_U20048_M0> -> 158
151 <- 158 < T154_U20261_M0> -> 190
158 <- 190 < T159_U20259_M0> -> 137
190 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 184
46 <- 184 < T135_U20255_M0> -> 186
184 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 180
116 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 91
154 <- 91 < T19_U20278_M0> -> 165
91 <- 165 < T75_U20246_M0> -> 155
165 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 175
112 <- 175 < T144_U20241_M0> -> 43
175 <- 43 < T18_U20020_M0> -> 193
43 <- 193 < T162_U20279_M0> -> 188
193 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 62
131 <- 62 < T121_U20226_M0> -> 160
62 <- 160 < T148_U20238_M0> -> 183
160 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 92
152 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 146
113 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 89
76 <- 89 < T12_U20220_M0> -> 138
89 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 59
74 <- 59 < T25_U20202_M0> -> 56
59 <- 56 < T35_U20203_M0> -> 45
56 <- 45 < T76_U20199_M0> -> 32
45 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 39
109 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 79
171 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 148
73 <- 148 < T123_U20185_M0> -> 50
148 <- 50 < T80_U19969_M0> -> 147
50 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 153
170 <- 153 < T129_U20184_M0> -> 129
153 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 54
144 <- 54 < T23_U20175_M0> -> 58
54 <- 58 < T77_U20172_M0> -> 120
58 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 228
227 <- 228 < T197_U20347_M0> -> 251
228 <- 251 < T220_U20382_M0> -> 253
251 <- 253 < T222_U20386_M0> -> 254
253 <- 254 < T223_U20387_M0> -> 255
254 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 244
249 <- 244 < T218_U20394_M0> -> 248
244 <- 248 < T214_U20395_M0> -> 246
248 <- 246 < T215_U20400_M0> -> 245
246 <- 245 < T213_U20402_M0> -> 247
245 <- 247 < T211_U20403_M0> -> 243
247 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 229
256 <- 229 < T198_U20411_M0> -> 257
229 <- 257 < T226_U20412_M0> -> 259
257 <- 259 < T228_U20415_M0> -> 260
259 <- 260 < T229_U20416_M0> -> 261
260 <- 261 < T230_U20417_M0> -> 262
261 <- 262 < T231_U20419_M0> -> 263
262 <- 263 < T232_U20424_M0> -> 264
263 <- 264 < T233_U20426_M0> -> 265
264 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 267
266 <- 267 < T236_U20429_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1

Requests in queue <W0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W5> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W6> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W7> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (22 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20246_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T156_U20224_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20336_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T198_U20411_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T199_U20353_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20368_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20371_M0> (2 requests, queue in use):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T212_U20408_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20402_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T215_U20400_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T211_U20403_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T214_U20395_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T216_U20391_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T219_U20381_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T228_U20415_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T229_U20416_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T230_U20417_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T231_U20419_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T232_U20424_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T233_U20426_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T234_U20427_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T235_U20428_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T236_U20429_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 04:48:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 182
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |4 |norm|T22_U19960_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |
| 1| |DIA |WP_KILL| |210|norm|T59_U19947_M0 |HTTP_NORM| | | |SAPMHTTP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | | 10334|SAPMSSY1 |001|SM_EFWK | | |
| 3|19909 |DIA |WP_RUN | | |high|T71_U20423_M0 |INTERNAL | | | 9| |000|SAPSYS | | |
| 4|19804 |DIA |WP_RUN | | |norm|T210_U20371_M0 |HTTP_NORM| | | 3| | | | | |
| 5| |DIA |WP_KILL| |4 |high|T109_U5012_M1 |GUI | | | |SBAL_DELETE |001|EXT_SCHAITAN| | |
| 6| |DIA |WP_KILL| |5 | | | | | | | | | | | |
| 7| |DIA |WP_KILL| |4 |norm|T55_U19953_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |

Found 8 active workprocesses


Total number of workprocesses is 16

Session Table Sun Sep 22 04:48:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| | |norm|3 | | | 0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| | |norm|6 | | | 0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| | |norm|3 | | | 0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |SAPMSSY1 |low | | | | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| | |norm|9 | | | 0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| | |norm|6 | | | 0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| | |norm|5 | | | 0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| | |norm|7 | | | 0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| | |norm|3 | | | 0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| | |norm|3 | | | 0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| | |norm|9 | | | 0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| | |norm|6 | | | 0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| | |norm|3 | | | 0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |SAPMSSY1 |low |2 | | | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| | |norm|3 | | | 0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| | |norm|3 | | | 0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| | |norm|2 | | | 0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| | |norm|3 | | | 0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| | |norm|10 | | | 0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| | |norm|7 | | | 0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| | |norm|3 | | | 0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| | |norm|3 | | | 0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| | |norm|3 | | | 0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |SAPMSSY1 |low | | | | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| | |norm|4 | | | 0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| | |norm|3 | | | 0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| | |norm|1 | | | 0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| | |norm|5 | | | 0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |SAPMSSY1 |norm|2 | | | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| | |norm|3 | | | 0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| | |norm|10 | | | 0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| | |norm|3 | | | 0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| | |norm|3 | | | 0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| | |norm|5 | | | 0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |SAPMSSY1 |norm| | | | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| | |norm|3 | | | 0|
|INTERNAL |T71_U20423_M0 |000|SAPSYS | |04:48:34|3 | |high| | | | 4200|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| | |norm|2 | | | 0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T75_U20246_M0 | | |10.54.36.35 |04:41:27| | |norm|2 | | | 0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| | |norm|2 | | | 0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| | |norm|3 | | | 0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| | |norm|1 | | | 0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |SAPMSSY1 |norm|1 | | | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| | |norm|22 | | | 0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| | |norm|2 | | | 0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| | |norm|2 | | | 0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| | |norm|3 | | | 0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| | |norm|1 | | | 0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| | |norm|1 | | | 0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| | |norm|1 | | | 0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |SAPMSSY1 |norm|1 | | | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| | |norm|2 | | | 0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| | |low |1 | | | 0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| | |norm|1 | | | 0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| | |norm|2 | | | 0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| | |norm|3 | | | 0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 | |SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| | |norm|3 | | | 0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| | |norm|5 | | | 0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| | |norm|3 | | | 0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| | |norm|3 | | | 0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm|1 | | | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| | |norm|9 | | | 0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| | |norm|2 | | | 0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| | |norm|2 | | | 0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |SAPMSSY1 |norm|1 | | | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| | |norm|6 | | | 0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| | |norm|2 | | | 0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| | |norm|1 | | | 0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| | |norm|1 | | | 0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| | |norm|2 | | | 0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| | |norm|2 | | | 0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| | |norm|4 | | | 0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| | |norm|3 | | | 0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| | |norm|1 | | | 0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| | |norm|2 | | | 0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| | |norm|2 | | | 0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| | |norm|1 | | | 0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| | |norm|2 | | | 0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| | |norm|5 | | | 0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| | |norm|2 | | | 0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| | |norm|2 | | | 0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| | |norm|6 | | | 0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| | |norm|2 | | | 0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| | |norm|2 | | | 0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| | |norm|1 | | | 0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| | |norm|3 | | | 0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| | |norm|2 | | | 0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| | |norm|2 | | | 0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| | |norm|2 | | | 0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| | |norm|3 | | | 0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| | |norm|2 | | | 0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| | |norm|5 | | | 0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| | |norm|2 | | | 0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| | |norm|1 | | | 0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| | |norm|2 | | | 0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| | |norm|2 | | | 0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| | |norm|1 | | | 0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| | |norm|14 | | | 0|
|HTTP_NORMAL |T191_U20336_M0 | | |10.50.47.13 |04:45:11| | |norm|2 | | | 0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| | |norm|1 | | | 0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| | |norm|2 | | | 0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| | |norm|3 | | | 0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| | |norm|9 | | | 0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| | |norm|1 | | | 0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| | |norm|14 | | | 0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| | |norm|5 | | | 0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| | |norm|3 | | | 0|
|HTTP_NORMAL |T208_U20368_M0 | | |10.54.36.37 |04:46:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T210_U20371_M0 | | |10.54.36.26 |04:48:40|4 | |norm|2 | | | 0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| | |norm|4 | | | 0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| | |norm|2 | | | 0|
|HTTP_NORMAL |T213_U20402_M0 | | |10.54.36.13 |04:47:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T214_U20395_M0 | | |10.50.47.13 |04:47:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| | |norm|1 | | | 0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| | |norm|1 | | | 0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| | |norm|1 | | | 0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| | |norm|3 | | | 0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| | |norm|2 | | | 0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| | |norm|4 | | | 0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| | |norm|2 | | | 0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| | |norm|4 | | | 0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| | |norm|1 | | | 0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| | |norm|1 | | | 0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| | |norm|1 | | | 0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| | |norm|1 | | | 0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| | |norm|1 | | | 0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| | |norm|1 | | | 0|

Found 236 logons with 236 sessions


Total ES (gross) memory of all sessions: 79 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T38_U20216_M0


Force ABAP stack dump of session T61_U20113_M0
Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T190_U20350_M0

Sun Sep 22 04:48:43:817 2019


Force ABAP stack dump of session T200_U20354_M0

RFC-Connection Table (49 entries) Sun Sep 22 04:48:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 49 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
45 ca_blk slots of 6000 in use, 43 currently unowned (in request queues)

MPI Info Sun Sep 22 04:48:43 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 219 / 1884
Periodic Tasks Sun Sep 22 04:48:43 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5703| 77| | |
| 1|DDLOG | 5703| 77| | |
| 2|BTCSCHED | 11405| 21| | |
| 3|RESTART_ALL | 2281| 73| | |
| 4|ENVCHECK | 34225| 20| | |
| 5|AUTOABAP | 2281| 73| | |
| 6|BGRFC_WATCHDOG | 2282| 73| | |
| 7|AUTOTH | 288| 21| | |
| 8|AUTOCCMS | 11405| 21| | |
| 9|AUTOSECURITY | 11404| 21| | |
| 10|LOAD_CALCULATION | 683559| 1| | |
| 11|SPOOLALRM | 11408| 21| | |
| 12|CALL_DELAYED | 0| 144| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 182 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:48:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:48:53:417 2019


DpHdlSoftCancel: cancel request for T224_U20389_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T222_U20386_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:48:57:243 2019


DpHdlSoftCancel: cancel request for T71_U20431_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:49:00:796 2019


DpHdlSoftCancel: cancel request for T216_U20391_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:49:03:806 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:49:04:140 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:49:15:815 2019


DpHdlSoftCancel: cancel request for T214_U20395_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:49:23:806 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:49:33:832 2019


DpHdlSoftCancel: cancel request for T215_U20400_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T213_U20402_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:49:38:835 2019


DpHdlSoftCancel: cancel request for T211_U20403_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:49:43:807 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 26928

Sun Sep 22 04:49:47:303 2019


DpHdlSoftCancel: ignore cancel for invalid T243_M0

Sun Sep 22 04:50:00:913 2019


DpHdlSoftCancel: cancel request for T226_U20412_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T198_U20411_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: ignore cancel for T244_U20459_M0

Sun Sep 22 04:50:03:807 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-27197
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-27198
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-27199
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-27200
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-27201
DpHdlSoftCancel: cancel request for T228_U20415_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:50:04:141 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:50:04:912 2019


*** ERROR => DpHdlDeadWp: W0 (pid 27197) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27197) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 27197)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 27198) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27198) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 27198)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 27199) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27199) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 27199)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 27200) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27200) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 27200)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 27201) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=27201) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 27201)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:50:08:854 2019


DpHdlSoftCancel: cancel request for T231_U20419_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T229_U20416_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T230_U20417_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:50:19:244 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:50:23:808 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 26928 terminated

Sun Sep 22 04:50:27:360 2019


DpHdlSoftCancel: delete in progress for T245_U20445_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:50:33:869 2019


DpHdlSoftCancel: cancel request for T233_U20426_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T234_U20427_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T232_U20424_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:50:38:874 2019


DpHdlSoftCancel: cancel request for T235_U20428_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T197_U20347_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:50:41:001 2019


DpHdlSoftCancel: cancel request for T236_U20429_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: delete in progress for T246_U20446_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:50:43:809 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:50:53:888 2019


DpHdlSoftCancel: cancel request for T239_U20436_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T227_U20432_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T237_U20433_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:51:03:810 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:51:03:989 2019


DpHdlSoftCancel: cancel request for T210_U20438_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T241_U20440_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:51:04:142 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:51:08:994 2019


DpHdlSoftCancel: cancel request for T242_U20441_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:51:17:320 2019


DpHdlSoftCancel: cancel request for T247_U20447_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:51:23:810 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:51:27:411 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:51:33:331 2019


DpHdlSoftCancel: cancel request for T71_U20451_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:51:43:810 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:51:53:349 2019


DpHdlSoftCancel: cancel request for T240_U20455_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:52:03:811 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:52:04:142 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:52:06:125 2019


DpHdlSoftCancel: cancel request for T249_U20458_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:52:23:812 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:52:33:142 2019


DpHdlSoftCancel: cancel request for T244_U20464_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:52:38:147 2019


DpHdlSoftCancel: cancel request for T243_U20466_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T250_U20467_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T251_U20468_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:52:40:781 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:52:43:813 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:52:49:368 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.928915 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.928727 / 0.000000

Sun Sep 22 04:52:53:162 2019


DpHdlSoftCancel: cancel request for T255_U20475_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T256_U20476_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:52:53:371 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 04:52:58:168 2019


DpHdlSoftCancel: cancel request for T257_U20478_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T258_U20479_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:53:00:797 2019


DpHdlSoftCancel: cancel request for T260_U20481_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T259_U20480_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:53:03:813 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:53:04:143 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:53:05:802 2019


DpHdlSoftCancel: cancel request for T289_U20532_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T265_U20488_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T266_U20489_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T267_U20490_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T262_U20484_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T264_U20486_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T263_U20485_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T261_U20483_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:53:15:809 2019


DpHdlSoftCancel: cancel request for T268_U20492_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:53:23:814 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:53:33:826 2019


DpHdlSoftCancel: cancel request for T269_U20497_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T270_U20498_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:53:38:831 2019


DpHdlSoftCancel: cancel request for T288_U20531_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:53:40:832 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:53:43:814 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
DpHdlSoftCancel: cancel request for T272_U20501_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:54:00:847 2019


DpHdlSoftCancel: cancel request for T245_U20504_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:54:03:815 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:54:04:144 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:54:23:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:49:43 2019, skip new snapshot

Sun Sep 22 04:54:28:863 2019


DpHdlSoftCancel: cancel request for T275_U20514_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:54:33:869 2019


DpHdlSoftCancel: cancel request for T285_U20525_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T284_U20523_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:54:38:872 2019


DpHdlSoftCancel: cancel request for T286_U20526_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T287_U20527_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:54:40:785 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:54:43:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 183 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:54:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 04:54:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 2
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 04:54:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 912 (peak 916, writeCount 24104383, readCount 24103471)
UPD : 0 (peak 31, writeCount 4957, readCount 4957)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125393, readCount 2125393)
SPO : 0 (peak 2, writeCount 25094, readCount 25094)
UP2 : 0 (peak 1, writeCount 2343, readCount 2343)
DISP: 0 (peak 67, writeCount 889845, readCount 889845)
GW : 0 (peak 49, writeCount 22411019, readCount 22411019)
ICM : 1 (peak 186, writeCount 391058, readCount 391057)
LWP : 6 (peak 16, writeCount 38234, readCount 38228)

Session queue dump (high priority, 0 elements, peak 39):

Session queue dump (normal priority, 291 elements, peak 292):
-1 <- 317 < T286_U20526_M0> -> 315
317 <- 315 < T284_U20523_M0> -> 316
315 <- 316 < T285_U20525_M0> -> 306
316 <- 306 < T275_U20514_M0> -> 213
306 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 276
210 <- 276 < T245_U20504_M0> -> 206
276 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 303
204 <- 303 < T272_U20501_M0> -> 202
303 <- 202 < T171_U20295_M0> -> 301
202 <- 301 < T270_U20498_M0> -> 300
301 <- 300 < T269_U20497_M0> -> 201
300 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 299
199 <- 299 < T268_U20492_M0> -> 196
299 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 292
103 <- 292 < T261_U20483_M0> -> 294
292 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 298
293 <- 298 < T267_U20490_M0> -> 297
298 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 320
296 <- 320 < T289_U20532_M0> -> 290
320 <- 290 < T259_U20480_M0> -> 291
290 <- 291 < T260_U20481_M0> -> 193
291 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 289
91 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 287
288 <- 287 < T256_U20476_M0> -> 286
287 <- 286 < T255_U20475_M0> -> 189
286 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 281
191 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 275
274 <- 275 < T244_U20464_M0> -> 182
275 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 280
176 <- 280 < T249_U20458_M0> -> 190
280 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 271
158 <- 271 < T240_U20455_M0> -> 184
271 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 135
154 <- 135 < T71_U20451_M0> -> 188
135 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 62
131 <- 62 < T121_U20226_M0> -> 160
62 <- 160 < T148_U20238_M0> -> 278
160 <- 278 < T247_U20447_M0> -> 175
278 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 273
152 <- 273 < T242_U20441_M0> -> 146
273 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 272
166 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 76
241 <- 76 < T20_U20221_M0> -> 89
76 <- 89 < T12_U20220_M0> -> 268
89 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 100
258 <- 100 < T38_U20216_M0> -> 270
100 <- 270 < T239_U20436_M0> -> 138
270 <- 138 < T106_U20213_M0> -> 267
138 <- 267 < T236_U20429_M0> -> 133
267 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 228
157 <- 228 < T197_U20347_M0> -> 266
228 <- 266 < T235_U20428_M0> -> 263
266 <- 263 < T232_U20424_M0> -> 265
263 <- 265 < T234_U20427_M0> -> 264
265 <- 264 < T233_U20426_M0> -> 74
264 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 261
59 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 262
260 <- 262 < T231_U20419_M0> -> 45
262 <- 45 < T76_U20199_M0> -> 259
45 <- 259 < T228_U20415_M0> -> 229
259 <- 229 < T198_U20411_M0> -> 257
229 <- 257 < T226_U20412_M0> -> 39
257 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 247
171 <- 247 < T211_U20403_M0> -> 245
247 <- 245 < T213_U20402_M0> -> 246
245 <- 246 < T215_U20400_M0> -> 248
246 <- 248 < T214_U20395_M0> -> 148
248 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 249
170 <- 249 < T216_U20391_M0> -> 129
249 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 54
144 <- 54 < T23_U20175_M0> -> 253
54 <- 253 < T222_U20386_M0> -> 255
253 <- 255 < T224_U20389_M0> -> 58
255 <- 58 < T77_U20172_M0> -> 231
58 <- 231 < T200_U20354_M0> -> 221
231 <- 221 < T190_U20350_M0> -> 78
221 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 252
69 <- 252 < T221_U20383_M0> -> 250
252 <- 250 < T219_U20381_M0> -> 119
250 <- 119 < T11_U20406_M0> -> 117
119 <- 117 < T31_U20168_M0> -> 239
117 <- 239 < T208_U20368_M0> -> 240
239 <- 240 < T209_U20370_M0> -> 60
240 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 237
124 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 242
238 <- 242 < T212_U20408_M0> -> 163
242 <- 163 < T133_U20159_M0> -> 236
163 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 230
233 <- 230 < T199_U20353_M0> -> 70
230 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 192
114 <- 192 < T161_U20349_M0> -> 118
192 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 226
105 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 125
224 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 222
42 <- 222 < T191_U20336_M0> -> 94
222 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 220
167 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 93
218 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 284
283 <- 284 < T253_U20470_M0> -> 285
284 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 277
305 <- 277 < T246_U20530_M0> -> 321
277 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 314
313 <- 314 < T278_U20536_M0> -> 308
314 <- 308 < T276_U20537_M0> -> 307
308 <- 307 < T277_U20538_M0> -> 312
307 <- 312 < T283_U20541_M0> -> 311
312 <- 311 < T279_U20542_M0> -> 309
311 <- 309 < T280_U20543_M0> -> 310
309 <- 310 < T282_U20544_M0> -> 322
310 <- 322 < T291_U20546_M0> -> 323
322 <- 323 < T292_U20547_M0> -> 324
323 <- 324 < T293_U20548_M0> -> 325
324 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 327
326 <- 327 < T296_U20556_M0> -> 165
327 <- 165 < T75_U20558_M0> -> 282
165 <- 282 < T251_U20559_M0> -> 328
282 <- 328 < T297_U20560_M0> -> 329
328 <- 329 < T298_U20562_M0> -> 330
329 <- 330 < T299_U20563_M0> -> 332
330 <- 332 < T301_U20567_M0> -> 333
332 <- 333 < T302_U20568_M0> -> 334
333 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1

Requests in queue <IcmanQueue> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_CREATE_SNAPSHOT
Requests in queue <W0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W5> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W6> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W7> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (23 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20336_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20368_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20402_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T215_U20400_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20395_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T247_U20447_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (13 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20476_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T257_U20478_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20490_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20498_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20538_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T276_U20537_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T280_U20543_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T282_U20544_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T284_U20523_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20527_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T289_U20532_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T292_U20547_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T293_U20548_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T294_U20553_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T295_U20554_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T296_U20556_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T297_U20560_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T301_U20567_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T302_U20568_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T303_U20574_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T304_U20575_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
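
The per-queue dump above has a regular shape (a queue line followed by per-handler counts). As a quick triage aid, the pending requests can be totalled per handler type with a short script — a sketch, not part of the trace; the sample lines simply mirror the format above:

```python
import re
from collections import Counter

# A few lines in the dump's format (queue name, then per-handler counts).
sample = """\
Requests in queue <T71_U20451_M0> (2 requests):
   - 1 requests for handler REQ_HANDLER_PLUGIN
   - 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
   - 1 requests for handler REQ_HANDLER_RFC
"""

totals = Counter()
for line in sample.splitlines():
    # Handler lines look like "- N requests for handler NAME".
    m = re.match(r"\s*-\s*(\d+) requests for handler (\w+)", line)
    if m:
        totals[m.group(2)] += int(m.group(1))

print(dict(totals))
# → {'REQ_HANDLER_PLUGIN': 1, 'REQ_HANDLER_SESSION': 1, 'REQ_HANDLER_RFC': 1}
```

Run against the full dump, this makes it obvious that REQ_HANDLER_PLUGIN requests dominate the backlog.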

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 04:54:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 183
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |5 |norm|T22_U19960_M0 |HTTP_NORM| | || |001|SM_EXTERN_WS| ||
| 1| |DIA |WP_KILL| |211|norm|T59_U19947_M0 |HTTP_NORM| | ||SAPMHTTP |001|SM_EXTERN_WS| ||
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10370|SAPMSSY1 |001|SM_EFWK || |
| 3|19909 |DIA |WP_RUN | | |norm|T287_U20527_M0 |HTTP_NORM| | |3| | | | ||
| 4|19804 |DIA |WP_RUN | | |high|T300_U20576_M0 |INTERNAL | | |3| | | | ||
| 5| |DIA |WP_KILL| |5 |high|T109_U5012_M1 |GUI | | ||SBAL_DELETE |001|EXT_SCHAITAN| ||
| 6| |DIA |WP_KILL| |6 | | | | | || | | | ||
| 7| |DIA |WP_KILL| |5 |norm|T55_U19953_M0 |HTTP_NORM| | || |001|SM_EXTERN_WS| ||

Found 8 active workprocesses


Total number of workprocesses is 16
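
The work-process rows above follow a fixed `|`-separated layout, so the State column (the 5th `|`-delimited field) can be tallied to see how many processes sit in WP_KILL versus WP_RUN. A sketch assuming that layout (the sample rows are abbreviated copies of the table's format):

```python
import re
from collections import Counter

# Three rows in the table's layout (pipe-separated; State is the 5th field).
rows = """\
| 0|      |DIA |WP_KILL|     |5  |norm|T22_U19960_M0
| 2|32121 |DIA |WP_HOLD|RFC  |   |low |T10_U9773_M0
| 3|19909 |DIA |WP_RUN |     |   |norm|T287_U20527_M0
"""

states = Counter(
    line.split("|")[4].strip()
    for line in rows.splitlines()
    if re.match(r"\|\s*\d+\|", line)  # keep only numbered WP rows
)
print(dict(states))
# → {'WP_KILL': 1, 'WP_HOLD': 1, 'WP_RUN': 1}
```

In this snapshot, 5 of the 8 active dialog work processes are in WP_KILL, which is consistent with the dispatcher's "Server is overloaded" message.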

Session Table Sun Sep 22 04:54:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| ||norm|3 | | |0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| ||norm|3 | | |0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| ||norm|3 | | |0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| ||norm|1 | | |0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| ||norm|6 | | |0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| ||norm|3 | | |0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| ||norm|3 | | |0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| ||norm|3 | | |0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| ||norm|3 | | |0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| ||norm|3 | | |0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |SAPMSSY1 |low | || | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| ||norm|2 | | |0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| ||norm|10 | | |0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| ||norm|6 | | |0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| ||norm|5 | | |0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| ||norm|7 | | |0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| ||norm|3 | | |0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| ||norm|3 | | |0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| ||norm|3 | | |0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| ||norm|3 | | |0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| ||norm|4 | | |0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| ||norm|3 | | |0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |SAPMHTTP |norm|1 || | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| ||norm|10 | | |0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| ||norm|1 | | |0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| ||norm|3 | | |0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| ||norm|6 | | |0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| ||norm|3 | | |0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| ||norm|3 | | |0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| ||norm|3 | | |0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |SAPMSSY1 |low |2 || | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| ||norm|3 | | |0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| ||norm|3 | | |0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| ||norm|1 | | |0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| ||norm|3 | | |0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| ||norm|3 | | |0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| ||norm|1 | | |0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| ||norm|3 | | |0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| ||norm|12 | | |0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| ||norm|7 | | |0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| ||norm|3 | | |0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| ||norm|3 | | |0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| ||norm|4 | | |0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| ||norm|3 | | |0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| ||norm|3 | | |0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| ||norm|1 | | |0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| ||norm|3 | | |0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |SAPMSSY1 |low | || | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| ||norm|4 | | |0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| ||norm|4 | | |0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| ||norm|3 | | |0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| ||norm|1 | | |0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| ||norm|4 | | |0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| ||norm|3 | | |0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| ||norm|4 | | |0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |SAPMHTTP |norm|1 || | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| ||norm|5 | | |0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |SAPMSSY1 |norm|2 || | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| ||norm|3 | | |0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |SAPMHTTP |norm|1 || | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| ||norm|3 | | |0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| ||norm|11 | | |0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| ||norm|3 | | |0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| ||norm|3 | | |0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| ||norm|3 | | |0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| ||norm|3 | | |0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| ||norm|3 | | |0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| ||norm|3 | | |0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| ||norm|5 | | |0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |SAPMSSY1 |norm| || | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| ||norm|3 | | |0|
|HTTP_NORMAL |T71_U20451_M0 | | |10.54.36.35 |04:49:29| ||norm|2 | | |0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | || | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| ||norm|3 | | |0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| ||norm|1 | | |0|
|SYNC_RFC |T75_U20558_M0 | | |smprd02.niladv.org |04:53:49| ||norm|1 | | |0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| ||norm|4 | | |0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| ||norm|3 | | |0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| ||norm|3 | | |0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| ||norm|3 | | |0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| ||norm|4 | | |0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| ||norm|3 | | |0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| ||norm|3 | | |0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| ||norm|3 | | |0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| ||norm|3 | | |0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| ||norm|3 | | |0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| ||norm|1 | | |0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |SAPMSSY1 |norm|1 || | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| ||norm|23 | | |0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| ||norm|3 | | |0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| ||norm|4 | | |0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| ||norm|4 | | |0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| ||norm|3 | | |0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| ||norm|3 | | |0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| ||norm|3 | | |0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| ||norm|1 | | |0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| ||norm|1 | | |0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| ||norm|1 | | |0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |SAPMSSY1 |norm|1 || | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| ||norm|3 | | |0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| ||norm|3 | | |0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| ||norm|3 | | |0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| ||norm|3 | | |0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| ||low |1 | | |0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| ||norm|1 | | |0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| ||norm|6 | | |0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| ||norm|3 | | |0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| ||norm|4 | | |0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| ||norm|3 | | |0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 ||SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| ||norm|3 | | |0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| ||norm|3 | | |0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| ||norm|1 | | |0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| ||norm|5 | | |0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| ||norm|3 | | |0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| ||norm|5 | | |0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| ||norm|3 | | |0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| ||norm|3 | | |0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm|1 || | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| |
|norm|10 | | |
0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| |
|norm|3 | | |
0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |
SAPMSSY1 |norm|1 |
| | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| |
|norm|6 | | |
0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| |
|norm|3 | | |
0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| |
|norm|1 | | |
0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| |
|norm|3 | | |
0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| |
|norm|4 | | |
0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| |
|norm|3 | | |
0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| |
|norm|1 | | |
0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| |
|norm|3 | | |
0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| |
|norm|1 | | |
0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| |
|norm|3 | | |
0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| |
|norm|7 | | |
0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| |
|norm|3 | | |
0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| |
|norm|1 | | |
0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| |
|norm|4 | | |
0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| |
|norm|4 | | |
0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| |
|norm|3 | | |
0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| |
|norm|3 | | |
0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| |
|norm|1 | | |
0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| |
|norm|15 | | |
0|
|HTTP_NORMAL |T191_U20336_M0 | | |10.50.47.13 |04:45:11| |
|norm|2 | | |
0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| |
|norm|1 | | |
0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| |
|norm|2 | | |
0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| |
|norm|3 | | |
0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| |
|norm|10 | | |
0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| |
|norm|15 | | |
0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| |
|norm|5 | | |
0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T208_U20368_M0 | | |10.54.36.37 |04:46:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| |
|norm|5 | | |
0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| |
|norm|2 | | |
0|
|HTTP_NORMAL |T213_U20402_M0 | | |10.54.36.13 |04:47:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T214_U20395_M0 | | |10.50.47.13 |04:47:11| |
|norm|2 | | |
0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| |
|norm|2 | | |
0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| |
|norm|2 | | |
0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| |
|norm|1 | | |
0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| |
|norm|3 | | |
0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| |
|norm|2 | | |
0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| |
|norm|5 | | |
0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| |
|norm|3 | | |
0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| |
|norm|5 | | |
0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| |
|norm|2 | | |
0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| |
|norm|2 | | |
0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| |
|norm|2 | | |
0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| |
|norm|2 | | |
0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| |
|norm|2 | | |
0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| |
|norm|4 | | |
0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| |
|norm|2 | | |
0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| |
|norm|3 | | |
0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| |
|norm|13 | | |
0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| |
|norm|9 | | |
0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| |
|norm|14 | | |
0|
|HTTP_NORMAL |T256_U20476_M0 | | |10.54.36.27 |04:50:51| |
|norm|4 | | |
0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| |
|norm|2 | | |
0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| |
|norm|2 | | |
0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| |
|norm|5 | | |
0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| |
|norm|14 | | |
0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| |
|norm|2 | | |
0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| |
|norm|2 | | |
0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T267_U20490_M0 | | |10.54.36.36 |04:51:05| |
|norm|2 | | |
0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| |
|norm|2 | | |
0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T270_U20498_M0 | | |10.54.36.11 |04:51:32| |
|norm|2 | | |
0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| |
|norm|2 | | |
0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| |
|norm|1 | | |
0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| |
|norm|2 | | |
0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T277_U20538_M0 | | |10.54.36.19 |04:52:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| |
|norm|1 | | |
0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| |
|norm|4 | | |
0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| |
|norm|1 | | |
0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| |
|norm|2 | | |
0|
|HTTP_NORMAL |T287_U20527_M0 | | |10.54.36.14 |04:54:40|3 |
|norm|1 | | |
0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| |
|norm|2 | | |
0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| |
|norm|2 | | |
0|
|HTTP_NORMAL |T293_U20548_M0 | | |10.50.47.13 |04:53:12| |
|norm|1 | | |
0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T296_U20556_M0 | | |10.54.36.41 |04:53:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| |
|norm|3 | | |
0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| |
|norm|1 | | |
0|
|INTERNAL |T300_U20576_M0 | | | |04:54:40|4 |
|high| | | |
0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| |
|norm|1 | | |
0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| |
|norm|1 | | |
0|

Found 304 logons with 304 sessions


Total ES (gross) memory of all sessions: 75 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T12_U20220_M0
Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 04:54:43:832 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0

RFC-Connection Table (61 entries) Sun Sep 22 04:54:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 61 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
2 WORKER 19804
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
376 INVALID -1
377 INVALID -1
378 INVALID -1
379 INVALID -1
380 INVALID -1
381 INVALID -1
382 INVALID -1
383 INVALID -1
384 INVALID -1
385 INVALID -1
386 INVALID -1
387 INVALID -1
58 ca_blk slots of 6000 in use, 55 currently unowned (in request queues)

MPI Info Sun Sep 22 04:54:43 2019


------------------------------------------------------------
Current pipes in use: 201
Current / maximal blocks in use: 233 / 1884

Periodic Tasks Sun Sep 22 04:54:43 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5706| 77| | |
| 1|DDLOG | 5706| 77| | |
| 2|BTCSCHED | 11410| 21| | |
| 3|RESTART_ALL | 2282| 13| | |
| 4|ENVCHECK | 34243| 20| | |
| 5|AUTOABAP | 2282| 13| | |
| 6|BGRFC_WATCHDOG | 2283| 13| | |
| 7|AUTOTH | 294| 21| | |
| 8|AUTOCCMS | 11410| 21| | |
| 9|AUTOSECURITY | 11409| 21| | |
| 10|LOAD_CALCULATION | 683918| 1| | |
| 11|SPOOLALRM | 11414| 21| | |
| 12|CALL_DELAYED | 0| 1241| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 183 (Reason: Workprocess 0 died / Time: Sun Sep 22 04:54:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:54:53:825 2019


DpHdlSoftCancel: cancel request for T246_U20530_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:54:59:313 2019


DpHdlSoftCancel: cancel request for T278_U20536_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T276_U20537_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:00:801 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
DpHdlSoftCancel: cancel request for T277_U20538_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:03:816 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W0-28601
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-28602
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W5-28603
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W6-28604
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W7-28605

Sun Sep 22 04:55:04:144 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:55:04:846 2019


*** ERROR => DpHdlDeadWp: W0 (pid 28601) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28601) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 28601)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 28602) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28602) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 28602)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 28603) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28603) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 28603)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 28604) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28604) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 28604)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 28605) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=28605) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 28605)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:55:05:899 2019


DpHdlSoftCancel: cancel request for T292_U20547_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T282_U20544_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T283_U20541_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T291_U20546_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T280_U20543_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:17:749 2019


DpHdlSoftCancel: cancel request for T293_U20548_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:23:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 04:55:24:161 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:55:33:921 2019


DpHdlSoftCancel: cancel request for T295_U20554_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T294_U20553_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:38:926 2019


DpHdlSoftCancel: cancel request for T253_U20470_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:43:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 28779

Sun Sep 22 04:55:43:929 2019


DpHdlSoftCancel: cancel request for T296_U20556_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:55:58:190 2019


DpHdlSoftCancel: cancel request for T297_U20560_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T251_U20559_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:56:03:194 2019


DpHdlSoftCancel: cancel request for T299_U20563_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:56:03:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:56:04:146 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
DpHdlSoftCancel: cancel request for T301_U20567_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T302_U20568_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:56:22:702 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:56:23:817 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 28779 terminated

Sun Sep 22 04:56:24:496 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:56:38:222 2019


DpHdlSoftCancel: cancel request for T304_U20575_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T303_U20574_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:56:43:818 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:56:44:512 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:56:53:233 2019


DpHdlSoftCancel: cancel request for T288_U20578_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:56:58:238 2019


DpHdlSoftCancel: cancel request for T305_U20580_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:57:03:243 2019


DpHdlSoftCancel: cancel request for T287_U20582_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T306_U20581_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:57:03:818 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:57:04:146 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:57:09:529 2019


DpHdlSoftCancel: cancel request for T307_U20585_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:57:18:020 2019


DpHdlSoftCancel: cancel request for T309_U20587_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:57:23:818 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:57:24:271 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:57:33:032 2019


DpHdlSoftCancel: cancel request for T310_U20592_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T300_U20591_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:57:38:033 2019


DpHdlSoftCancel: cancel request for T311_U20594_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T312_U20595_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:57:43:819 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:58:03:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:58:03:994 2019


DpHdlSoftCancel: cancel request for T330_U20639_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T315_U20602_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T318_U20606_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T316_U20603_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T317_U20605_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:58:04:147 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:58:23:820 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:58:33:331 2019


DpHdlSoftCancel: cancel request for T319_U20613_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:58:38:332 2019


DpHdlSoftCancel: cancel request for T214_U20638_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:58:40:799 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:58:43:337 2019


DpHdlSoftCancel: cancel request for T321_U20616_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:58:43:821 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:58:48:823 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.900483 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.900478 / 0.000000

Sun Sep 22 04:58:52:828 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 04:58:53:346 2019


DpHdlSoftCancel: cancel request for T191_U20619_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T323_U20621_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T324_U20622_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:59:03:354 2019


DpHdlSoftCancel: cancel request for T327_U20627_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T326_U20626_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T325_U20624_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:59:03:821 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:59:04:147 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:59:13:362 2019


DpHdlSoftCancel: cancel request for T329_U20632_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:59:20:832 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 04:59:23:822 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:59:33:379 2019


DpHdlSoftCancel: cancel request for T208_U20635_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:59:38:383 2019


DpHdlSoftCancel: cancel request for T296_U20636_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:59:40:868 2019


DpHdlSoftCancel: ignore cancel for invalid T343_M0

Sun Sep 22 04:59:43:823 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 04:59:44:401 2019


DpHdlSoftCancel: delete in progress for T341_U20682_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 04:59:58:400 2019


DpHdlSoftCancel: cancel request for T332_U20642_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:00:900 2019


DpHdlSoftCancel: cancel request for T333_U20643_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:03:403 2019


DpHdlSoftCancel: cancel request for T335_U20645_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:03:823 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-30480
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-30481
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-30482
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-30483
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-30484

Sun Sep 22 05:00:04:148 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:00:04:432 2019


DpHdlSoftCancel: delete in progress for T352_U20684_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:04:942 2019


*** ERROR => DpHdlDeadWp: W0 (pid 30480) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30480) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 30480)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 30481) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30481) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 30481)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 30482) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30482) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 30482)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 30483) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30483) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 30483)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 30484) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=30484) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 30484)
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 05:00:05:949 2019


DpUpdateStatusFileWith: state=YELLOW, reason=Request handling without progress
*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (1. check) [dpxxwp.c 4705]

Sun Sep 22 05:00:06:950 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (2. check) [dpxxwp.c 4705]

Sun Sep 22 05:00:07:951 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (3. check) [dpxxwp.c 4705]

Sun Sep 22 05:00:08:405 2019


DpHdlSoftCancel: cancel request for T336_U20648_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:08:952 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (4. check) [dpxxwp.c 4705]

Sun Sep 22 05:00:09:953 2019


*** WARNING => DpRequestProcessingCheck: potential request processing problem detected (5. check) [dpxxwp.c 4705]

Sun Sep 22 05:00:20:918 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:00:23:824 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 04:55:43 2019, skip new snapshot

Sun Sep 22 05:00:28:297 2019


DpHdlSoftCancel: cancel request for T337_U20653_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:33:302 2019


DpHdlSoftCancel: cancel request for T346_U20662_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:38:306 2019


DpHdlSoftCancel: cancel request for T347_U20664_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T348_U20665_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:40:937 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:00:43:311 2019


DpHdlSoftCancel: cancel request for T313_U20596_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:43:825 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 184 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:00:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 05:00:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 2
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 05:00:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 1152 (peak 1152, writeCount 24104720, readCount 24103568)
UPD : 0 (peak 31, writeCount 4958, readCount 4958)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125398, readCount 2125398)
SPO : 0 (peak 2, writeCount 25101, readCount 25101)
UP2 : 0 (peak 1, writeCount 2344, readCount 2344)
DISP: 0 (peak 67, writeCount 889944, readCount 889944)
GW : 0 (peak 49, writeCount 22411020, readCount 22411020)
ICM : 0 (peak 186, writeCount 391075, readCount 391075)
LWP : 1 (peak 16, writeCount 38248, readCount 38247)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 352 elements, peak 353):
-1 <- 344 < T313_U20596_M0> -> 275
344 <- 275 < T244_U20464_M0> -> 378
275 <- 378 < T347_U20664_M0> -> 377
378 <- 377 < T346_U20662_M0> -> 368
377 <- 368 < T337_U20653_M0> -> 367
368 <- 367 < T336_U20648_M0> -> 280
367 <- 280 < T249_U20458_M0> -> 366
280 <- 366 < T335_U20645_M0> -> 364
366 <- 364 < T333_U20643_M0> -> 271
364 <- 271 < T240_U20455_M0> -> 363
271 <- 363 < T332_U20642_M0> -> 135
363 <- 135 < T71_U20451_M0> -> 327
135 <- 327 < T296_U20636_M0> -> 239
327 <- 239 < T208_U20635_M0> -> 278
239 <- 278 < T247_U20447_M0> -> 360
278 <- 360 < T329_U20632_M0> -> 273
360 <- 273 < T242_U20441_M0> -> 272
273 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 270
241 <- 270 < T239_U20436_M0> -> 268
270 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 356
258 <- 356 < T325_U20624_M0> -> 357
356 <- 357 < T326_U20626_M0> -> 358
357 <- 358 < T327_U20627_M0> -> 355
358 <- 355 < T324_U20622_M0> -> 354
355 <- 354 < T323_U20621_M0> -> 222
354 <- 222 < T191_U20619_M0> -> 267
222 <- 267 < T236_U20429_M0> -> 265
267 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 264
266 <- 264 < T233_U20426_M0> -> 352
264 <- 352 < T321_U20616_M0> -> 263
352 <- 263 < T232_U20424_M0> -> 248
263 <- 248 < T214_U20638_M0> -> 350
248 <- 350 < T319_U20613_M0> -> 262
350 <- 262 < T231_U20419_M0> -> 261
262 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 229
260 <- 229 < T198_U20411_M0> -> 259
229 <- 259 < T228_U20415_M0> -> 257
259 <- 257 < T226_U20412_M0> -> 119
257 <- 119 < T11_U20406_M0> -> 348
119 <- 348 < T317_U20605_M0> -> 242
348 <- 242 < T212_U20408_M0> -> 347
242 <- 347 < T316_U20603_M0> -> 349
347 <- 349 < T318_U20606_M0> -> 346
349 <- 346 < T315_U20602_M0> -> 361
346 <- 361 < T330_U20639_M0> -> 247
361 <- 247 < T211_U20403_M0> -> 246
247 <- 246 < T215_U20400_M0> -> 343
246 <- 343 < T312_U20595_M0> -> 342
343 <- 342 < T311_U20594_M0> -> 331
342 <- 331 < T300_U20591_M0> -> 341
331 <- 341 < T310_U20592_M0> -> 340
341 <- 340 < T309_U20587_M0> -> 338
340 <- 338 < T307_U20585_M0> -> 255
338 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 337
249 <- 337 < T306_U20581_M0> -> 318
337 <- 318 < T287_U20582_M0> -> 253
318 <- 253 < T222_U20386_M0> -> 336
253 <- 336 < T305_U20580_M0> -> 319
336 <- 319 < T288_U20578_M0> -> 252
319 <- 252 < T221_U20383_M0> -> 240
252 <- 240 < T209_U20370_M0> -> 250
240 <- 250 < T219_U20381_M0> -> 334
250 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> 237
335 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 236
238 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 231
233 <- 231 < T200_U20354_M0> -> 333
231 <- 333 < T302_U20568_M0> -> 332
333 <- 332 < T301_U20567_M0> -> 230
332 <- 230 < T199_U20353_M0> -> 330
230 <- 330 < T299_U20563_M0> -> 221
330 <- 221 < T190_U20350_M0> -> 192
221 <- 192 < T161_U20349_M0> -> 282
192 <- 282 < T251_U20559_M0> -> 328
282 <- 328 < T297_U20560_M0> -> 226
328 <- 226 < T195_U20345_M0> -> 228
226 <- 228 < T197_U20347_M0> -> 225
228 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 284
224 <- 284 < T253_U20470_M0> -> 325
284 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 324
326 <- 324 < T293_U20548_M0> -> 220
324 <- 220 < T189_U20331_M0> -> 309
220 <- 309 < T280_U20543_M0> -> 322
309 <- 322 < T291_U20546_M0> -> 312
322 <- 312 < T283_U20541_M0> -> 310
312 <- 310 < T282_U20544_M0> -> 323
310 <- 323 < T292_U20547_M0> -> 308
323 <- 308 < T276_U20537_M0> -> 314
308 <- 314 < T278_U20536_M0> -> 277
314 <- 277 < T246_U20530_M0> -> 219
277 <- 219 < T188_U20326_M0> -> 291
219 <- 291 < T260_U20481_M0> -> 286
291 <- 286 < T255_U20475_M0> -> 62
286 <- 62 < T121_U20226_M0> -> 78
62 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 100
69 <- 100 < T38_U20216_M0> -> 54
100 <- 54 < T23_U20175_M0> -> 89
54 <- 89 < T12_U20220_M0> -> 218
89 <- 218 < T187_U20325_M0> -> 317
218 <- 317 < T286_U20526_M0> -> 315
317 <- 315 < T284_U20523_M0> -> 316
315 <- 316 < T285_U20525_M0> -> 306
316 <- 306 < T275_U20514_M0> -> 213
306 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 276
210 <- 276 < T245_U20504_M0> -> 206
276 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 303
204 <- 303 < T272_U20501_M0> -> 202
303 <- 202 < T171_U20295_M0> -> 301
202 <- 301 < T270_U20498_M0> -> 300
301 <- 300 < T269_U20497_M0> -> 201
300 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 299
199 <- 299 < T268_U20492_M0> -> 196
299 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 292
103 <- 292 < T261_U20483_M0> -> 294
292 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 298
293 <- 298 < T267_U20490_M0> -> 297
298 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 320
296 <- 320 < T289_U20532_M0> -> 290
320 <- 290 < T259_U20480_M0> -> 193
290 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 289
91 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 287
288 <- 287 < T256_U20476_M0> -> 189
287 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 281
191 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 182
274 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 190
176 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 184
158 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 188
154 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 175
160 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 146
152 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 138
76 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 45
59 <- 45 < T76_U20199_M0> -> 39
45 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 148
171 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 129
170 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 58
144 <- 58 < T77_U20172_M0> -> 117
58 <- 117 < T31_U20168_M0> -> 60
117 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 70
163 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 118
114 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 93
167 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 285
283 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 321
305 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 311
313 <- 311 < T279_U20542_M0> -> 165
311 <- 165 < T75_U20558_M0> -> 329
165 <- 329 < T298_U20562_M0> -> 339
329 <- 339 < T308_U20586_M0> -> 345
339 <- 345 < T314_U20597_M0> -> 307
345 <- 307 < T277_U20600_M0> -> 351
307 <- 351 < T320_U20615_M0> -> 353
351 <- 353 < T322_U20620_M0> -> 359
353 <- 359 < T328_U20630_M0> -> 362
359 <- 362 < T331_U20641_M0> -> 380
362 <- 380 < T349_U20669_M0> -> 381
380 <- 381 < T350_U20670_M0> -> 382
381 <- 382 < T351_U20671_M0> -> 376
382 <- 376 < T339_U20673_M0> -> 371
376 <- 371 < T342_U20674_M0> -> 375
371 <- 375 < T344_U20676_M0> -> 374
375 <- 374 < T340_U20677_M0> -> 373
374 <- 373 < T345_U20679_M0> -> 245
373 <- 245 < T213_U20687_M0> -> 370
245 <- 370 < T341_U20690_M0> -> 372
370 <- 372 < T343_U20691_M0> -> 365
372 <- 365 < T334_U20692_M0> -> 384
365 <- 384 < T353_U20694_M0> -> 385
384 <- 385 < T354_U20697_M0> -> 386
385 <- 386 < T355_U20698_M0> -> 387
386 <- 387 < T356_U20700_M0> -> 388
387 <- 388 < T357_U20702_M0> -> 389
388 <- 389 < T358_U20703_M0> -> 390
389 <- 390 < T359_U20704_M0> -> 391
390 <- 391 < T360_U20705_M0> -> 392
391 <- 392 < T361_U20707_M0> -> 393
392 <- 393 < T362_U20711_M0> -> 394
393 <- 394 < T363_U20712_M0> -> 395
394 <- 395 < T364_U20713_M0> -> 396
395 <- 396 < T365_U20714_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1

Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (24 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20619_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20635_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20687_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T215_U20400_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20638_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T247_U20447_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20476_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T257_U20478_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20490_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20498_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20600_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T276_U20537_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T280_U20543_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T282_U20544_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T284_U20523_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20582_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T288_U20578_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T289_U20532_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T292_U20547_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T293_U20548_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T294_U20553_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T295_U20554_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T296_U20636_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T297_U20560_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T300_U20591_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T301_U20567_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T302_U20568_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T303_U20574_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T304_U20575_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T305_U20580_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T306_U20581_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T307_U20585_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T308_U20586_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T309_U20587_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T310_U20592_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T311_U20594_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T312_U20595_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T313_U20596_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T314_U20597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T315_U20602_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T316_U20603_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T317_U20605_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T318_U20606_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T319_U20613_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T320_U20615_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T321_U20616_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T322_U20620_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T323_U20621_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T324_U20622_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T325_U20624_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T326_U20626_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T327_U20627_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T328_U20630_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T329_U20632_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T330_U20639_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T331_U20641_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T332_U20642_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T333_U20643_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T334_U20692_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T335_U20645_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T336_U20648_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T337_U20653_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T341_U20690_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T342_U20674_M0> (8 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T343_U20691_M0> (7 requests):
- 7 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T345_U20679_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T340_U20677_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T344_U20676_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T339_U20673_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T346_U20662_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T347_U20664_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T348_U20665_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T349_U20669_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T350_U20670_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T351_U20671_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T353_U20694_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T354_U20697_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T355_U20698_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T356_U20700_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T357_U20702_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T358_U20703_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T359_U20704_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T360_U20705_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T361_U20707_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T362_U20711_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T363_U20712_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T364_U20713_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T365_U20714_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
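
The queue dump above follows a fixed pattern: a `Requests in queue <Tnnn_Unnnnn_M0> (N requests)` line followed by one `- N requests for handler REQ_HANDLER_*` line per handler. A minimal, illustrative Python sketch (not part of the trace; the embedded sample lines are copied from the dump above) for tallying total requests per handler from such a dump:

```python
import re
from collections import Counter

# Sample fragment in the same format as the dispatcher queue dump above.
SAMPLE = """\
Requests in queue <T190_U20350_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
"""

def tally_handlers(text):
    """Sum per-handler request counts from a dispatcher queue dump."""
    counts = Counter()
    for n, handler in re.findall(r"- (\d+) requests for handler (\w+)", text):
        counts[handler] += int(n)
    return counts

totals = tally_handlers(SAMPLE)
```

With the full dump as input, a tally like this shows at a glance that the backlog is dominated by REQ_HANDLER_PLUGIN requests.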

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 05:00:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 184
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |7 |norm|T22_U19960_M0 |HTTP_NORM| | || |001|SM_EXTERN_WS| ||
| 1| |DIA |WP_KILL| |213|norm|T59_U19947_M0 |HTTP_NORM| | ||SAPMHTTP |001|SM_EXTERN_WS| ||
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10406|SAPMSSY1 |001|SM_EFWK || |
| 3|19909 |DIA |WP_RUN | | |norm|T348_U20665_M0 |HTTP_NORM| | |3| | | | ||
| 4|19804 |DIA |WP_RUN | | |high|T352_U20709_M0 |INTERNAL | | |9| |000|SAPSYS |REPLOAD ||
| 5| |DIA |WP_KILL| |7 |high|T109_U5012_M1 |GUI | | ||SBAL_DELETE |001|EXT_SCHAITAN| ||
| 6| |DIA |WP_KILL| |8 | | | | | || | | | ||
| 7| |DIA |WP_KILL| |7 |norm|T55_U19953_M0 |HTTP_NORM| | || |001|SM_EXTERN_WS| ||

Found 8 active workprocesses


Total number of workprocesses is 16
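
Since the `State` column above carries the key health signal (5 of the 8 active workprocesses are in WP_KILL, consistent with the "Server is overloaded" message), a quick way to summarize such a table is to tally the state tokens. An illustrative sketch (not part of the trace; sample rows are whitespace-collapsed copies from the table above):

```python
import re
from collections import Counter

# Three sample rows in the same format as the workprocess table above.
WP_TABLE = """\
| 0| |DIA |WP_KILL| |7 |norm|T22_U19960_M0 |HTTP_NORM|
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC|
| 3|19909 |DIA |WP_RUN | | |norm|T348_U20665_M0 |HTTP_NORM|
"""

def count_states(table_text):
    """Tally workprocess states (WP_RUN, WP_HOLD, WP_KILL, ...) in a dump."""
    return Counter(re.findall(r"WP_[A-Z]+", table_text))

states = count_states(WP_TABLE)
```

Applied to the full table, a high WP_KILL or WP_HOLD share relative to the total workprocess count is the overload indicator to watch.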

Session Table Sun Sep 22 05:00:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| |
|norm|3 | | |
0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |
SAPMSSY1 |low | |
| | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| |
|norm|11 | | |
0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| |
|norm|7 | | |
0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| |
|norm|11 | | |
0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| |
|norm|6 | | |
0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| |
|norm|3 | | |
0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |
SAPMSSY1 |low |2 |
| | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| |
|norm|3 | | |
0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| |
|norm|3 | | |
0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| |
|norm|13 | | |
0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| |
|norm|7 | | |
0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| |
|norm|3 | | |
0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| |
|norm|3 | | |
0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| |
|norm|3 | | |
0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |
SAPMSSY1 |low | |
| | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| |
|norm|4 | | |
0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| |
|norm|3 | | |
0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |
SAPMSSY1 |norm|2 |
| | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| |
|norm|12 | | |
0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |
SAPMSSY1 |norm| |
| | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T71_U20451_M0 | | |10.54.36.35 |04:49:29| |
|norm|3 | | |
0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| |
|norm|3 | | |
0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| |
|norm|1 | | |
0|
|SYNC_RFC |T75_U20558_M0 | | |smprd02.niladv.org |04:53:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| |
|norm|4 | | |
0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| |
|norm|3 | | |
0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| |
|norm|1 | | |
0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| |
|norm|24 | | |
0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| |
|norm|1 | | |
0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| |
|norm|1 | | |
0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| |
|norm|1 | | |
0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |
SAPMSSY1 |norm|1 |
| | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| |
|norm|3 | | |
0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| |
|low |1 | | |
0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| ||norm|4 | | |0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| ||norm|3 | | |0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 ||SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| ||norm|3 | | |0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| ||norm|3 | | |0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| ||norm|1 | | |0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| ||norm|5 | | |0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| ||norm|3 | | |0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| ||norm|5 | | |0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| ||norm|3 | | |0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| ||norm|3 | | |0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm|1 || | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| ||norm|4 | | |0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| ||norm|6 | | |0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| ||norm|11 | | |0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| ||norm|1 | | |0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| ||norm|3 | | |0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| ||norm|3 | | |0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| ||norm|3 | | |0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |SAPMSSY1 |norm|1 || | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| ||norm|6 | | |0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| ||norm|1 | | |0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| ||norm|3 | | |0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| ||norm|1 | | |0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| ||norm|1 | | |0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| ||norm|3 | | |0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| ||norm|3 | | |0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| ||norm|3 | | |0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| ||norm|3 | | |0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| ||norm|1 | | |0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| ||norm|3 | | |0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| ||norm|4 | | |0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| ||norm|1 | | |0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| ||norm|3 | | |0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| ||norm|1 | | |0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| ||norm|1 | | |0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| ||norm|3 | | |0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| ||norm|3 | | |0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| ||norm|3 | | |0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| ||norm|1 | | |0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | || | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| ||norm|3 | | |0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| ||norm|3 | | |0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| ||norm|3 | | |0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| ||norm|3 | | |0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| ||norm|3 | | |0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| ||norm|5 | | |0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| ||norm|3 | | |0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| ||norm|1 | | |0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| ||norm|3 | | |0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| ||norm|3 | | |0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| ||norm|3 | | |0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| ||norm|7 | | |0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| ||norm|3 | | |0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| ||norm|3 | | |0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| ||norm|3 | | |0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| ||norm|1 | | |0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| ||norm|3 | | |0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| ||norm|3 | | |0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| ||norm|1 | | |0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| ||norm|1 | | |0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| ||norm|3 | | |0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| ||norm|4 | | |0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| ||norm|4 | | |0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| ||norm|3 | | |0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| ||norm|3 | | |0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| ||norm|3 | | |0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| ||norm|3 | | |0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| ||norm|4 | | |0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| ||norm|3 | | |0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| ||norm|4 | | |0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| ||norm|1 | | |0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| ||norm|3 | | |0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| ||norm|5 | | |0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| ||norm|6 | | |0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| ||norm|3 | | |0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| ||norm|1 | | |0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| ||norm|3 | | |0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| ||norm|3 | | |0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| ||norm|1 | | |0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| ||norm|3 | | |0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| ||norm|3 | | |0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| ||norm|3 | | |0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| ||norm|17 | | |0|
|HTTP_NORMAL |T191_U20619_M0 | | |10.54.36.29 |04:56:48| ||norm|2 | | |0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| ||norm|1 | | |0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| ||norm|3 | | |0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| ||norm|4 | | |0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| ||norm|4 | | |0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| ||norm|1 | | |0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| ||norm|12 | | |0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| ||norm|3 | | |0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| ||norm|3 | | |0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| ||norm|17 | | |0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| ||norm|5 | | |0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| ||norm|6 | | |0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| ||norm|4 | | |0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| ||norm|4 | | |0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| ||norm|4 | | |0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| ||norm|4 | | |0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| ||norm|4 | | |0|
|HTTP_NORMAL |T208_U20635_M0 | | |10.54.36.37 |04:57:33| ||norm|2 | | |0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| ||norm|3 | | |0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| ||norm|6 | | |0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| ||norm|6 | | |0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| ||norm|3 | | |0|
|HTTP_NORMAL |T213_U20687_M0 | | |10.54.36.11 |04:59:32| ||norm|1 | | |0|
|HTTP_NORMAL |T214_U20638_M0 | | |10.54.36.29 |04:57:48| ||norm|2 | | |0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| ||norm|3 | | |0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| ||norm|3 | | |0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| ||norm|1 | | |0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| ||norm|1 | | |0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| ||norm|4 | | |0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| ||norm|1 | | |0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| ||norm|3 | | |0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| ||norm|6 | | |0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| ||norm|1 | | |0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| ||norm|4 | | |0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| ||norm|1 | | |0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| ||norm|6 | | |0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| ||norm|3 | | |0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| ||norm|6 | | |0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| ||norm|5 | | |0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| ||norm|4 | | |0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| ||norm|3 | | |0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| ||norm|3 | | |0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| ||norm|3 | | |0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| ||norm|3 | | |0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| ||norm|3 | | |0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| ||norm|3 | | |0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| ||norm|3 | | |0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| ||norm|1 | | |0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| ||norm|3 | | |0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| ||norm|5 | | |0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| ||norm|3 | | |0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| ||norm|3 | | |0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| ||norm|2 | | |0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| ||norm|4 | | |0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| ||norm|2 | | |0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| ||norm|2 | | |0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| ||norm|4 | | |0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| ||norm|1 | | |0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| ||norm|3 | | |0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| ||norm|3 | | |0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| ||norm|15 | | |0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| ||norm|1 | | |0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| ||norm|10 | | |0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| ||norm|1 | | |0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| ||norm|15 | | |0|
|HTTP_NORMAL |T256_U20476_M0 | | |10.54.36.27 |04:50:51| ||norm|4 | | |0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| ||norm|2 | | |0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| ||norm|2 | | |0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| ||norm|5 | | |0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| ||norm|15 | | |0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| ||norm|2 | | |0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| ||norm|2 | | |0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| ||norm|2 | | |0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| ||norm|2 | | |0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| ||norm|2 | | |0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| ||norm|2 | | |0|
|HTTP_NORMAL |T267_U20490_M0 | | |10.54.36.36 |04:51:05| ||norm|2 | | |0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| ||norm|2 | | |0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| ||norm|2 | | |0|
|HTTP_NORMAL |T270_U20498_M0 | | |10.54.36.11 |04:51:32| ||norm|2 | | |0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| ||norm|1 | | |0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| ||norm|2 | | |0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| ||norm|1 | | |0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| ||norm|1 | | |0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| ||norm|2 | | |0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| ||norm|2 | | |0|
|SYNC_RFC |T277_U20600_M0 | | |smprd02.niladv.org |04:55:48| ||norm|1 | | |0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| ||norm|2 | | |0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| ||norm|1 | | |0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| ||norm|5 | | |0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| ||norm|1 | | |0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| ||norm|2 | | |0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| ||norm|4 | | |0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| ||norm|2 | | |0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| ||norm|2 | | |0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| ||norm|2 | | |0|
|HTTP_NORMAL |T287_U20582_M0 | | |10.54.36.12 |04:55:02| ||norm|3 | | |0|
|HTTP_NORMAL |T288_U20578_M0 | | |10.54.36.29 |04:54:48| ||norm|9 | | |0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| ||norm|2 | | |0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| ||norm|1 | | |0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| ||norm|4 | | |0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| ||norm|3 | | |0|
|HTTP_NORMAL |T293_U20548_M0 | | |10.50.47.13 |04:53:12| ||norm|2 | | |0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| ||norm|2 | | |0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| ||norm|2 | | |0|
|HTTP_NORMAL |T296_U20636_M0 | | |10.54.36.14 |04:57:37| ||norm|2 | | |0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| ||norm|4 | | |0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| ||norm|1 | | |0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| ||norm|2 | | |0|
|HTTP_NORMAL |T300_U20591_M0 | | |10.54.36.35 |04:55:27| ||norm|2 | | |0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| ||norm|2 | | |0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| ||norm|2 | | |0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| ||norm|2 | | |0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| ||norm|2 | | |0|
|HTTP_NORMAL |T305_U20580_M0 | | |10.50.47.10 |04:54:54| ||norm|2 | | |0|
|HTTP_NORMAL |T306_U20581_M0 | | |10.54.36.34 |04:54:58| ||norm|2 | | |0|
|HTTP_NORMAL |T307_U20585_M0 | | |10.54.36.17 |04:55:05| ||norm|2 | | |0|
|ASYNC_RFC |T308_U20586_M0 | | |10.54.36.10 |04:55:10| ||norm|1 | | |0|
|HTTP_NORMAL |T309_U20587_M0 | | |10.50.47.13 |04:55:12| ||norm|3 | | |0|
|HTTP_NORMAL |T310_U20592_M0 | | |10.54.36.13 |04:55:32| ||norm|3 | | |0|
|HTTP_NORMAL |T311_U20594_M0 | | |10.54.36.28 |04:55:33| ||norm|3 | | |0|
|HTTP_NORMAL |T312_U20595_M0 | | |10.54.36.26 |04:55:33| ||norm|2 | | |0|
|HTTP_NORMAL |T313_U20596_M0 | | |10.54.36.29 |04:55:38| ||norm|10 | | |0|
|SYNC_RFC |T314_U20597_M0 | | |smprd02.niladv.org |04:55:39| ||norm|1 | | |0|
|HTTP_NORMAL |T315_U20602_M0 | | |10.54.36.19 |04:55:58| ||norm|14 | | |0|
|HTTP_NORMAL |T316_U20603_M0 | | |10.54.36.15 |04:56:01| ||norm|4 | | |0|
|HTTP_NORMAL |T317_U20605_M0 | | |10.54.36.30 |04:56:03| ||norm|3 | | |0|
|HTTP_NORMAL |T318_U20606_M0 | | |10.54.36.38 |04:56:03| ||norm|4 | | |0|
|HTTP_NORMAL |T319_U20613_M0 | | |10.54.36.11 |04:56:32| ||norm|2 | | |0|
|SYNC_RFC |T320_U20615_M0 | | |smprd02.niladv.org |04:56:39| ||norm|1 | | |0|
|HTTP_NORMAL |T321_U20616_M0 | | |10.54.36.41 |04:56:40| ||norm|2 | | |0|
|SYNC_RFC |T322_U20620_M0 | | |smprd02.niladv.org |04:56:49| ||norm|1 | | |0|
|HTTP_NORMAL |T323_U20621_M0 | | |10.54.36.32 |04:56:50| ||norm|9 | | |0|
|HTTP_NORMAL |T324_U20622_M0 | | |10.54.36.27 |04:56:50| ||norm|2 | | |0|
|HTTP_NORMAL |T325_U20624_M0 | | |10.54.36.33 |04:56:57| ||norm|2 | | |0|
|HTTP_NORMAL |T326_U20626_M0 | | |10.54.36.36 |04:57:03| ||norm|2 | | |0|
|HTTP_NORMAL |T327_U20627_M0 | | |10.54.36.25 |04:57:03| ||norm|2 | | |0|
|SYNC_RFC |T328_U20630_M0 | | |smprd02.niladv.org |04:57:11| ||norm|1 | | |0|
|HTTP_NORMAL |T329_U20632_M0 | | |10.50.47.13 |04:57:13| ||norm|3 | | |0|
|HTTP_NORMAL |T330_U20639_M0 | | |10.54.36.29 |04:57:48| ||norm|2 | | |0|
|SYNC_RFC |T331_U20641_M0 | | |smprd02.niladv.org |04:57:53| ||norm|1 | | |0|
|HTTP_NORMAL |T332_U20642_M0 | | |10.50.47.10 |04:57:53| ||norm|2 | | |0|
|HTTP_NORMAL |T333_U20643_M0 | | |10.54.36.34 |04:57:58| ||norm|2 | | |0|
|SYNC_RFC |T334_U20692_M0 | | |smprd02.niladv.org |04:59:50| ||norm|1 | | |0|
|HTTP_NORMAL |T335_U20645_M0 | | |10.54.36.12 |04:58:02| ||norm|5 | | |0|
|HTTP_NORMAL |T336_U20648_M0 | | |10.54.36.17 |04:58:05| ||norm|2 | | |0|
|HTTP_NORMAL |T337_U20653_M0 | | |10.54.36.35 |04:58:27| ||norm|2 | | |0|
|SYNC_RFC |T339_U20673_M0 | | |smprd02.niladv.org |04:58:53| ||norm|1 | | |0|
|HTTP_NORMAL |T340_U20677_M0 | | |10.54.36.38 |04:59:02| ||norm|2 | | |0|
|SYNC_RFC |T341_U20690_M0 | | |smprd02.niladv.org |04:59:49| ||norm|1 | | |0|
|HTTP_NORMAL |T342_U20674_M0 | | |10.54.36.19 |04:58:58| ||norm|8 | | |0|
|HTTP_NORMAL |T343_U20691_M0 | | |10.54.36.29 |04:59:49| ||norm|7 | | |0|
|HTTP_NORMAL |T344_U20676_M0 | | |10.54.36.15 |04:59:02| ||norm|1 | | |0|
|HTTP_NORMAL |T345_U20679_M0 | | |10.54.36.30 |04:59:03| ||norm|1 | | |0|
|HTTP_NORMAL |T346_U20662_M0 | | |10.54.36.13 |04:58:32| ||norm|3 | | |0|
|HTTP_NORMAL |T347_U20664_M0 | | |10.54.36.26 |04:58:33| ||norm|2 | | |0|
|HTTP_NORMAL |T348_U20665_M0 | | |10.54.36.28 |05:00:40|3 ||norm|1 | | |0|
|SYNC_RFC |T349_U20669_M0 | | |smprd02.niladv.org |04:58:48| ||norm|1 | | |0|
|HTTP_NORMAL |T350_U20670_M0 | | |10.54.36.32 |04:58:50| ||norm|2 | | |0|
|HTTP_NORMAL |T351_U20671_M0 | | |10.54.36.27 |04:58:50| ||norm|2 | | |0|
|INTERNAL |T352_U20709_M0 |000|SAPSYS | |05:00:34|4 ||high| | | |4200|
|HTTP_NORMAL |T353_U20694_M0 | | |10.54.36.33 |04:59:57| ||norm|1 | | |0|
|HTTP_NORMAL |T354_U20697_M0 | | |10.54.36.25 |05:00:03| ||norm|1 | | |0|
|HTTP_NORMAL |T355_U20698_M0 | | |10.54.36.36 |05:00:04| ||norm|1 | | |0|
|HTTP_NORMAL |T356_U20700_M0 | | |10.50.47.13 |05:00:12| ||norm|1 | | |0|
|SYNC_RFC |T357_U20702_M0 | | |smprd02.niladv.org |05:00:14| ||norm|1 | | |0|
|SYNC_RFC |T358_U20703_M0 | | |smprd02.niladv.org |05:00:16| ||norm|1 | | |0|
|SYNC_RFC |T359_U20704_M0 | | |smprd02.niladv.org |05:00:18| ||norm|1 | | |0|
|SYNC_RFC |T360_U20705_M0 | | |smprd02.niladv.org |05:00:20| ||norm|1 | | |0|
|SYNC_RFC |T361_U20707_M0 | | |smprd02.niladv.org |05:00:22| ||norm|1 | | |0|
|HTTP_NORMAL |T362_U20711_M0 | | |10.54.36.37 |05:00:33| ||norm|1 | | |0|
|HTTP_NORMAL |T363_U20712_M0 | | |10.54.36.14 |05:00:37| ||norm|1 | | |0|
|HTTP_NORMAL |T364_U20713_M0 | | |10.54.36.29 |05:00:38| ||norm|9 | | |0|
|SYNC_RFC |T365_U20714_M0 | | |smprd02.niladv.org |05:00:39| ||norm|1 | | |0|

Found 365 logons with 365 sessions


Total ES (gross) memory of all sessions: 79 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T12_U20220_M0


Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 05:00:43:841 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T253_U20470_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0
Force ABAP stack dump of session T313_U20596_M0
Force ABAP stack dump of session T315_U20602_M0

RFC-Connection Table (78 entries) Sun Sep 22 05:00:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 26|54305161|54305161SU20692_M0 |T334_U20692_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 33|54303059|54303059SU20690_M0 |T341_U20690_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 44|54184460|54184460SU20641_M0 |T331_U20641_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 67|54240969|54240969SU20669_M0 |T349_U20669_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 84|54055969|54055969SU20600_M0 |T277_U20600_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 90|53335572|53335572SU20704_M0 |T359_U20704_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 98|32252616|32252616SU20586_M0 |T308_U20586_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 126|54246530|54246530SU20673_M0 |T339_U20673_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 149|53338570|53338570SU20705_M0 |T360_U20705_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 176|54333571|54333571SU20703_M0 |T358_U20703_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 177|54330571|54330571SU20702_M0 |T357_U20702_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 178|53359275|53359275SU20714_M0 |T365_U20714_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 187|53341583|53341583SU20707_M0 |T361_U20707_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 216|54141465|54141465SU20630_M0 |T328_U20630_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 221|54118046|54118046SU20620_M0 |T322_U20620_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 234|54107318|54107318SU20615_M0 |T320_U20615_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 263|54045247|54045247SU20597_M0 |T314_U20597_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 78 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
376 INVALID -1
377 INVALID -1
378 INVALID -1
379 INVALID -1
380 INVALID -1
381 INVALID -1
382 INVALID -1
383 INVALID -1
384 INVALID -1
385 INVALID -1
386 INVALID -1
387 INVALID -1
388 INVALID -1
389 INVALID -1
390 INVALID -1
391 INVALID -1
392 INVALID -1
393 INVALID -1
394 INVALID -1
395 INVALID -1
396 INVALID -1
397 INVALID -1
398 INVALID -1
399 INVALID -1
400 INVALID -1
401 INVALID -1
402 INVALID -1
403 INVALID -1
404 INVALID -1
74 ca_blk slots of 6000 in use, 72 currently unowned (in request queues)

MPI Info Sun Sep 22 05:00:43 2019


------------------------------------------------------------
Current pipes in use: 191
Current / maximal blocks in use: 227 / 1884

Periodic Tasks Sun Sep 22 05:00:43 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID|
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5709| 77| | |
| 1|DDLOG | 5709| 77| | |
| 2|BTCSCHED | 11416| 21| | |
| 3|RESTART_ALL | 2284| 253| | |
| 4|ENVCHECK | 34261| 20| | |
| 5|AUTOABAP | 2284| 253| | |
| 6|BGRFC_WATCHDOG | 2285| 253| | |
| 7|AUTOTH | 299| 21| | |
| 8|AUTOCCMS | 11415| 21| | |
| 9|AUTOSECURITY | 11414| 21| | |
| 10|LOAD_CALCULATION | 684276| 0| | |
| 11|SPOOLALRM | 11420| 21| | |
| 12|CALL_DELAYED | 0| 881| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 184 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:00:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:00:53:832 2019


DpHdlSoftCancel: cancel request for T350_U20670_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T351_U20671_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:00:58:994 2019


DpHdlSoftCancel: cancel request for T342_U20674_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:01:03:825 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:01:04:148 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
Sun Sep 22 05:01:05:954 2019
DpHdlSoftCancel: cancel request for T345_U20679_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T344_U20676_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T340_U20677_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:01:23:826 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:01:24:182 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:01:34:004 2019


DpHdlSoftCancel: cancel request for T213_U20687_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:01:43:827 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 1210

Sun Sep 22 05:01:58:217 2019


DpHdlSoftCancel: cancel request for T343_U20691_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:02:03:222 2019


DpHdlSoftCancel: cancel request for T353_U20694_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:02:03:828 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:02:04:149 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:02:09:222 2019


DpHdlSoftCancel: cancel request for T354_U20697_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T355_U20698_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:02:14:227 2019


DpHdlSoftCancel: cancel request for T356_U20700_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:02:22:728 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:02:23:829 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 1210 terminated

Sun Sep 22 05:02:24:641 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:02:38:246 2019


DpHdlSoftCancel: cancel request for T363_U20712_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T362_U20711_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:02:43:829 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:02:44:659 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:02:53:252 2019


DpHdlSoftCancel: cancel request for T366_U20718_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T338_U20717_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:02:58:255 2019


DpHdlSoftCancel: cancel request for T368_U20721_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T367_U20720_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:03:03:257 2019


DpHdlSoftCancel: cancel request for T348_U20723_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T370_U20724_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T371_U20725_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T369_U20722_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:03:03:829 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:03:04:156 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:03:08:261 2019


DpHdlSoftCancel: cancel request for T401_U20776_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T372_U20727_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T374_U20729_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:03:23:830 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:03:24:336 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:03:28:838 2019


DpHdlSoftCancel: cancel request for T373_U20738_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:03:33:841 2019


DpHdlSoftCancel: cancel request for T380_U20739_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T381_U20740_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T382_U20741_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T383_U20742_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:03:38:846 2019


DpHdlSoftCancel: cancel request for T400_U20775_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:03:43:830 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpHdlSoftCancel: cancel request for T384_U20744_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:04:03:370 2019


DpHdlSoftCancel: cancel request for T389_U20754_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T387_U20751_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T386_U20750_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:04:03:831 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:04:04:150 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:04:04:372 2019


DpHdlSoftCancel: cancel request for T390_U20755_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:04:23:832 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:04:38:401 2019


DpHdlSoftCancel: cancel request for T397_U20769_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T398_U20770_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:04:40:801 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:04:43:832 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:04:48:852 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.902120 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.902203 / 0.000000

Sun Sep 22 05:04:52:856 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 05:04:53:411 2019


DpHdlSoftCancel: cancel request for T399_U20774_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T404_U20779_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T403_U20778_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:04:59:449 2019


DpHdlSoftCancel: cancel request for T407_U20783_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T405_U20781_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:05:03:833 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-14354
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-14355
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-14356
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-14357
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-14358

Sun Sep 22 05:05:04:150 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:05:05:224 2019


*** ERROR => DpHdlDeadWp: W0 (pid 14354) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14354) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 14354)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 14355) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14355) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 14355)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 14356) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14356) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 14356)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 14357) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14357) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 14357)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 14358) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=14358) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 14358)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:05:05:821 2019


DpHdlSoftCancel: cancel request for T412_U20789_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T413_U20790_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T411_U20788_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T409_U20785_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T410_U20786_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:05:17:107 2019


DpHdlSoftCancel: cancel request for T414_U20793_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:05:20:835 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:05:23:834 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:05:33:122 2019


DpHdlSoftCancel: cancel request for T422_U20805_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T267_U20801_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T270_U20802_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T420_U20803_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T421_U20804_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:05:40:853 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:05:43:129 2019


DpHdlSoftCancel: cancel request for T423_U20807_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T364_U20713_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:05:43:834 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:06:03:834 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:06:04:152 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:06:05:875 2019


DpHdlSoftCancel: cancel request for T431_U20819_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T427_U20814_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T426_U20813_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T429_U20817_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:06:23:835 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:01:43 2019, skip new snapshot

Sun Sep 22 05:06:33:894 2019


DpHdlSoftCancel: cancel request for T446_U20840_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:06:38:895 2019


DpHdlSoftCancel: cancel request for T447_U20841_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:06:40:791 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:06:43:835 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 185 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:06:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 05:06:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 2
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 05:06:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 1467 (peak 1470, writeCount 24105128, readCount 24103661)
UPD : 0 (peak 31, writeCount 4959, readCount 4959)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125403, readCount 2125403)
SPO : 0 (peak 2, writeCount 25108, readCount 25108)
UP2 : 0 (peak 1, writeCount 2345, readCount 2345)
DISP: 0 (peak 67, writeCount 890046, readCount 890046)
GW : 0 (peak 49, writeCount 22411021, readCount 22411021)
ICM : 0 (peak 186, writeCount 391091, readCount 391091)
LWP : 1 (peak 16, writeCount 38262, readCount 38261)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 465 elements, peak 466):
-1 <- 350 < T319_U20613_M0> -> 477
350 <- 477 < T446_U20840_M0> -> 348
477 <- 348 < T317_U20605_M0> -> 347
348 <- 347 < T316_U20603_M0> -> 349
347 <- 349 < T318_U20606_M0> -> 346
349 <- 346 < T315_U20602_M0> -> 460
346 <- 460 < T429_U20817_M0> -> 457
460 <- 457 < T426_U20813_M0> -> 458
457 <- 458 < T427_U20814_M0> -> 462
458 <- 462 < T431_U20819_M0> -> 344
462 <- 344 < T313_U20596_M0> -> 343
344 <- 343 < T312_U20595_M0> -> 342
343 <- 342 < T311_U20594_M0> -> 395
342 <- 395 < T364_U20713_M0> -> 454
395 <- 454 < T423_U20807_M0> -> 341
454 <- 341 < T310_U20592_M0> -> 452
341 <- 452 < T421_U20804_M0> -> 451
452 <- 451 < T420_U20803_M0> -> 301
451 <- 301 < T270_U20802_M0> -> 298
301 <- 298 < T267_U20801_M0> -> 453
298 <- 453 < T422_U20805_M0> -> 340
453 <- 340 < T309_U20587_M0> -> 338
340 <- 338 < T307_U20585_M0> -> 445
338 <- 445 < T414_U20793_M0> -> 318
445 <- 318 < T287_U20582_M0> -> 337
318 <- 337 < T306_U20581_M0> -> 336
337 <- 336 < T305_U20580_M0> -> 441
336 <- 441 < T410_U20786_M0> -> 440
441 <- 440 < T409_U20785_M0> -> 442
440 <- 442 < T411_U20788_M0> -> 444
442 <- 444 < T413_U20790_M0> -> 443
444 <- 443 < T412_U20789_M0> -> 319
443 <- 319 < T288_U20578_M0> -> 436
319 <- 436 < T405_U20781_M0> -> 438
436 <- 438 < T407_U20783_M0> -> 435
438 <- 435 < T403_U20778_M0> -> 434
435 <- 434 < T404_U20779_M0> -> 430
434 <- 430 < T399_U20774_M0> -> 334
430 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> 428
335 <- 428 < T397_U20769_M0> -> 333
428 <- 333 < T302_U20568_M0> -> 332
333 <- 332 < T301_U20567_M0> -> 330
332 <- 330 < T299_U20563_M0> -> 421
330 <- 421 < T390_U20755_M0> -> 282
421 <- 282 < T251_U20559_M0> -> 417
282 <- 417 < T386_U20750_M0> -> 328
417 <- 328 < T297_U20560_M0> -> 418
328 <- 418 < T387_U20751_M0> -> 420
418 <- 420 < T389_U20754_M0> -> 325
420 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 431
326 <- 431 < T400_U20775_M0> -> 414
431 <- 414 < T383_U20742_M0> -> 413
414 <- 413 < T382_U20741_M0> -> 412
413 <- 412 < T381_U20740_M0> -> 411
412 <- 411 < T380_U20739_M0> -> 404
411 <- 404 < T373_U20738_M0> -> 309
404 <- 309 < T280_U20543_M0> -> 322
309 <- 322 < T291_U20546_M0> -> 312
322 <- 312 < T283_U20541_M0> -> 310
312 <- 310 < T282_U20544_M0> -> 323
310 <- 323 < T292_U20547_M0> -> 405
323 <- 405 < T374_U20729_M0> -> 308
405 <- 308 < T276_U20537_M0> -> 403
308 <- 403 < T372_U20727_M0> -> 314
403 <- 314 < T278_U20536_M0> -> 432
314 <- 432 < T401_U20776_M0> -> 400
432 <- 400 < T369_U20722_M0> -> 277
400 <- 277 < T246_U20530_M0> -> 402
277 <- 402 < T371_U20725_M0> -> 401
402 <- 401 < T370_U20724_M0> -> 320
401 <- 320 < T289_U20532_M0> -> 379
320 <- 379 < T348_U20723_M0> -> 398
379 <- 398 < T367_U20720_M0> -> 399
398 <- 399 < T368_U20721_M0> -> 369
399 <- 369 < T338_U20717_M0> -> 397
369 <- 397 < T366_U20718_M0> -> 317
397 <- 317 < T286_U20526_M0> -> 316
317 <- 316 < T285_U20525_M0> -> 315
316 <- 315 < T284_U20523_M0> -> 393
315 <- 393 < T362_U20711_M0> -> 394
393 <- 394 < T363_U20712_M0> -> 306
394 <- 306 < T275_U20514_M0> -> 387
306 <- 387 < T356_U20700_M0> -> 386
387 <- 386 < T355_U20698_M0> -> 385
386 <- 385 < T354_U20697_M0> -> 276
385 <- 276 < T245_U20504_M0> -> 384
276 <- 384 < T353_U20694_M0> -> 372
384 <- 372 < T343_U20691_M0> -> 303
372 <- 303 < T272_U20501_M0> -> 300
303 <- 300 < T269_U20497_M0> -> 245
300 <- 245 < T213_U20687_M0> -> 299
245 <- 299 < T268_U20492_M0> -> 297
299 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 292
296 <- 292 < T261_U20483_M0> -> 291
292 <- 291 < T260_U20481_M0> -> 294
291 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 374
293 <- 374 < T340_U20677_M0> -> 375
374 <- 375 < T344_U20676_M0> -> 373
375 <- 373 < T345_U20679_M0> -> 290
373 <- 290 < T259_U20480_M0> -> 289
290 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 286
288 <- 286 < T255_U20475_M0> -> 371
286 <- 371 < T342_U20674_M0> -> 382
371 <- 382 < T351_U20671_M0> -> 381
382 <- 381 < T350_U20670_M0> -> 284
381 <- 284 < T253_U20470_M0> -> 281
284 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 231
274 <- 231 < T200_U20354_M0> -> 228
231 <- 228 < T197_U20347_M0> -> 221
228 <- 221 < T190_U20350_M0> -> 62
221 <- 62 < T121_U20226_M0> -> 78
62 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 100
69 <- 100 < T38_U20216_M0> -> 54
100 <- 54 < T23_U20175_M0> -> 89
54 <- 89 < T12_U20220_M0> -> 275
89 <- 275 < T244_U20464_M0> -> 378
275 <- 378 < T347_U20664_M0> -> 377
378 <- 377 < T346_U20662_M0> -> 368
377 <- 368 < T337_U20653_M0> -> 367
368 <- 367 < T336_U20648_M0> -> 280
367 <- 280 < T249_U20458_M0> -> 366
280 <- 366 < T335_U20645_M0> -> 364
366 <- 364 < T333_U20643_M0> -> 271
364 <- 271 < T240_U20455_M0> -> 363
271 <- 363 < T332_U20642_M0> -> 135
363 <- 135 < T71_U20451_M0> -> 327
135 <- 327 < T296_U20636_M0> -> 239
327 <- 239 < T208_U20635_M0> -> 278
239 <- 278 < T247_U20447_M0> -> 360
278 <- 360 < T329_U20632_M0> -> 273
360 <- 273 < T242_U20441_M0> -> 272
273 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 270
241 <- 270 < T239_U20436_M0> -> 268
270 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 356
258 <- 356 < T325_U20624_M0> -> 357
356 <- 357 < T326_U20626_M0> -> 358
357 <- 358 < T327_U20627_M0> -> 355
358 <- 355 < T324_U20622_M0> -> 354
355 <- 354 < T323_U20621_M0> -> 222
354 <- 222 < T191_U20619_M0> -> 267
222 <- 267 < T236_U20429_M0> -> 265
267 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 264
266 <- 264 < T233_U20426_M0> -> 352
264 <- 352 < T321_U20616_M0> -> 263
352 <- 263 < T232_U20424_M0> -> 248
263 <- 248 < T214_U20638_M0> -> 262
248 <- 262 < T231_U20419_M0> -> 261
262 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 229
260 <- 229 < T198_U20411_M0> -> 259
229 <- 259 < T228_U20415_M0> -> 257
259 <- 257 < T226_U20412_M0> -> 119
257 <- 119 < T11_U20406_M0> -> 242
119 <- 242 < T212_U20408_M0> -> 361
242 <- 361 < T330_U20639_M0> -> 247
361 <- 247 < T211_U20403_M0> -> 246
247 <- 246 < T215_U20400_M0> -> 255
246 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 253
249 <- 253 < T222_U20386_M0> -> 252
253 <- 252 < T221_U20383_M0> -> 240
252 <- 240 < T209_U20370_M0> -> 250
240 <- 250 < T219_U20381_M0> -> 237
250 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 236
238 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 230
233 <- 230 < T199_U20353_M0> -> 192
230 <- 192 < T161_U20349_M0> -> 226
192 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 220
224 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 213
218 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 206
210 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 202
204 <- 202 < T171_U20295_M0> -> 201
202 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 196
199 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 193
103 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 189
91 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 182
191 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 190
176 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 184
158 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 188
154 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 175
160 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 146
152 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 138
76 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 45
59 <- 45 < T76_U20199_M0> -> 39
45 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 148
171 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 129
170 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 58
144 <- 58 < T77_U20172_M0> -> 117
58 <- 117 < T31_U20168_M0> -> 60
117 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 70
163 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 118
114 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 93
167 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 285
283 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 321
305 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 311
313 <- 311 < T279_U20542_M0> -> 165
311 <- 165 < T75_U20558_M0> -> 329
165 <- 329 < T298_U20562_M0> -> 339
329 <- 339 < T308_U20586_M0> -> 345
339 <- 345 < T314_U20597_M0> -> 307
345 <- 307 < T277_U20600_M0> -> 351
307 <- 351 < T320_U20615_M0> -> 353
351 <- 353 < T322_U20620_M0> -> 359
353 <- 359 < T328_U20630_M0> -> 362
359 <- 362 < T331_U20641_M0> -> 380
362 <- 380 < T349_U20669_M0> -> 376
380 <- 376 < T339_U20673_M0> -> 370
376 <- 370 < T341_U20690_M0> -> 365
370 <- 365 < T334_U20692_M0> -> 388
365 <- 388 < T357_U20702_M0> -> 389
388 <- 389 < T358_U20703_M0> -> 390
389 <- 390 < T359_U20704_M0> -> 391
390 <- 391 < T360_U20705_M0> -> 392
391 <- 392 < T361_U20707_M0> -> 396
392 <- 396 < T365_U20714_M0> -> 383
396 <- 383 < T352_U20716_M0> -> 406
383 <- 406 < T375_U20731_M0> -> 407
406 <- 407 < T376_U20732_M0> -> 408
407 <- 408 < T377_U20733_M0> -> 409
408 <- 409 < T378_U20734_M0> -> 410
409 <- 410 < T379_U20736_M0> -> 416
410 <- 416 < T385_U20745_M0> -> 287
416 <- 287 < T256_U20748_M0> -> 419
287 <- 419 < T388_U20752_M0> -> 422
419 <- 422 < T391_U20758_M0> -> 423
422 <- 423 < T392_U20760_M0> -> 424
423 <- 424 < T393_U20761_M0> -> 425
424 <- 425 < T394_U20762_M0> -> 426
425 <- 426 < T395_U20763_M0> -> 427
426 <- 427 < T396_U20764_M0> -> 433
427 <- 433 < T402_U20777_M0> -> 437
433 <- 437 < T406_U20782_M0> -> 439
437 <- 439 < T408_U20784_M0> -> 446
439 <- 446 < T415_U20795_M0> -> 447
446 <- 447 < T416_U20796_M0> -> 448
447 <- 448 < T417_U20797_M0> -> 449
448 <- 449 < T418_U20798_M0> -> 450
449 <- 450 < T419_U20799_M0> -> 324
450 <- 324 < T293_U20809_M0> -> 455
324 <- 455 < T424_U20810_M0> -> 456
455 <- 456 < T425_U20812_M0> -> 461
456 <- 461 < T430_U20818_M0> -> 463
461 <- 463 < T432_U20821_M0> -> 464
463 <- 464 < T433_U20823_M0> -> 465
464 <- 465 < T434_U20824_M0> -> 466
465 <- 466 < T435_U20825_M0> -> 467
466 <- 467 < T436_U20827_M0> -> 468
467 <- 468 < T437_U20828_M0> -> 459
468 <- 459 < T428_U20844_M0> -> 479
459 <- 479 < T448_U20845_M0> -> 476
479 <- 476 < T441_U20847_M0> -> 475
476 <- 475 < T445_U20848_M0> -> 474
475 <- 474 < T443_U20850_M0> -> 473
474 <- 473 < T442_U20851_M0> -> 471
473 <- 471 < T444_U20853_M0> -> 472
471 <- 472 < T440_U20854_M0> -> 470
472 <- 470 < T439_U20856_M0> -> 469
470 <- 469 < T438_U20857_M0> -> 480
469 <- 480 < T449_U20859_M0> -> 481
480 <- 481 < T450_U20860_M0> -> 482
481 <- 482 < T451_U20861_M0> -> 483
482 <- 483 < T452_U20863_M0> -> 484
483 <- 484 < T453_U20864_M0> -> 485
484 <- 485 < T454_U20867_M0> -> 486
485 <- 486 < T455_U20868_M0> -> 487
486 <- 487 < T456_U20869_M0> -> 488
487 <- 488 < T457_U20870_M0> -> 489
488 <- 489 < T458_U20872_M0> -> 490
489 <- 490 < T459_U20873_M0> -> 491
490 <- 491 < T460_U20874_M0> -> 415
491 <- 415 < T384_U20876_M0> -> 429
415 <- 429 < T398_U20877_M0> -> 492
429 <- 492 < T461_U20878_M0> -> 493
492 <- 493 < T462_U20880_M0> -> 494
493 <- 494 < T463_U20881_M0> -> 496
494 <- 496 < T465_U20884_M0> -> 497
496 <- 497 < T466_U20886_M0> -> 498
497 <- 498 < T467_U20887_M0> -> 499
498 <- 499 < T468_U20888_M0> -> 500
499 <- 500 < T469_U20890_M0> -> 501
500 <- 501 < T470_U20891_M0> -> 502
501 <- 502 < T471_U20892_M0> -> 503
502 <- 503 < T472_U20893_M0> -> 504
503 <- 504 < T473_U20896_M0> -> 505
504 <- 505 < T474_U20897_M0> -> 506
505 <- 506 < T475_U20899_M0> -> 507
506 <- 507 < T476_U20901_M0> -> 508
507 <- 508 < T477_U20902_M0> -> 509
508 <- 509 < T478_U20903_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1
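Each line of the dump above describes one node of the dispatcher's session queue as a doubly linked list: `prev <- slot < Tnnn_Unnnnn_Mn> -> next`, with `-1` marking both ends of the list. A minimal sketch of how such lines could be parsed and the chain verified (the regex, function names, and consistency check are illustrative assumptions, not part of the SAP kernel or any SAP tooling):

```python
import re

# Assumed format of one session queue dump line, e.g.
#   "320 <- 379 < T348_U20723_M0> -> 398"
# Groups: prev slot, this slot, session id, next slot. -1 = end of list.
LINE_RE = re.compile(r"^\s*(-?\d+) <- (-?\d+) < (T\d+_U\d+_M\d+)> -> (-?\d+)\s*$")

def parse_queue_line(line):
    """Return (prev, slot, session, next) for one dump line, or None."""
    m = LINE_RE.match(line)
    if not m:
        return None
    prev_slot, slot, session, next_slot = m.groups()
    return int(prev_slot), int(slot), session, int(next_slot)

def check_links(lines):
    """Check doubly-linked-list consistency across consecutive dump lines:
    each node's 'next' must equal the following node's slot, and that
    node's 'prev' must point back to the current slot."""
    nodes = [n for n in (parse_queue_line(l) for l in lines) if n]
    for a, b in zip(nodes, nodes[1:]):
        if a[3] != b[1] or b[0] != a[1]:
            return False
    return True
```

Run against the dump above, `check_links` would confirm the forward and backward pointers of adjacent lines agree, which is a quick way to spot a corrupted queue chain when reading such traces.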

Requests in queue <W2> (1 requests, queue in use):


- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (25 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (18 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20619_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (18 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20635_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20687_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T215_U20400_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20638_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T247_U20447_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20748_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T257_U20478_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20801_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20802_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20600_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T276_U20537_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T280_U20543_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T282_U20544_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T284_U20523_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20582_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T288_U20578_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T289_U20532_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T292_U20547_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T293_U20809_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T294_U20553_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T295_U20554_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T296_U20636_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T297_U20560_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T301_U20567_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T302_U20568_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T303_U20574_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T304_U20575_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T305_U20580_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T306_U20581_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T307_U20585_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T308_U20586_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T309_U20587_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T310_U20592_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T311_U20594_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T312_U20595_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T313_U20596_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T314_U20597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T315_U20602_M0> (16 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T316_U20603_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T317_U20605_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T318_U20606_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T319_U20613_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T320_U20615_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T321_U20616_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T322_U20620_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T323_U20621_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T324_U20622_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T325_U20624_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T326_U20626_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T327_U20627_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T328_U20630_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T329_U20632_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T330_U20639_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T331_U20641_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T332_U20642_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T333_U20643_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T334_U20692_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T335_U20645_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T336_U20648_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T337_U20653_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T338_U20717_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T341_U20690_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T342_U20674_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T343_U20691_M0> (8 requests):
- 7 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T345_U20679_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T340_U20677_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T344_U20676_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T339_U20673_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T346_U20662_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T347_U20664_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T348_U20723_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T349_U20669_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T350_U20670_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T351_U20671_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T352_U20716_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T353_U20694_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T354_U20697_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T355_U20698_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T356_U20700_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T357_U20702_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T358_U20703_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T359_U20704_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T360_U20705_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T361_U20707_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T362_U20711_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T363_U20712_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T364_U20713_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T365_U20714_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T366_U20718_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T367_U20720_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T368_U20721_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T369_U20722_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T370_U20724_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T371_U20725_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T372_U20727_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T373_U20738_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T374_U20729_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T375_U20731_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T376_U20732_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T377_U20733_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T378_U20734_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T379_U20736_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T380_U20739_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T381_U20740_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T382_U20741_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T383_U20742_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T384_U20876_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T385_U20745_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T386_U20750_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T387_U20751_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T388_U20752_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T389_U20754_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T390_U20755_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T391_U20758_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T392_U20760_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T393_U20761_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T394_U20762_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T395_U20763_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T396_U20764_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T397_U20769_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T398_U20877_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T399_U20774_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T400_U20775_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T401_U20776_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T402_U20777_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T404_U20779_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T403_U20778_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T405_U20781_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T406_U20782_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T407_U20783_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T408_U20784_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T409_U20785_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T410_U20786_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T411_U20788_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T412_U20789_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T413_U20790_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T414_U20793_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T415_U20795_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T416_U20796_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T417_U20797_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T418_U20798_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T419_U20799_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T420_U20803_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T421_U20804_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T422_U20805_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T423_U20807_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T424_U20810_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T425_U20812_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T426_U20813_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T427_U20814_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T428_U20844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T429_U20817_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T430_U20818_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T431_U20819_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T432_U20821_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T433_U20823_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T434_U20824_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T435_U20825_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T436_U20827_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T437_U20828_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T438_U20857_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T439_U20856_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T444_U20853_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T440_U20854_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T442_U20851_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T443_U20850_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T445_U20848_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T441_U20847_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T446_U20840_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T447_U20841_M0> (2 requests, queue in use):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T448_U20845_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T449_U20859_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T450_U20860_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T451_U20861_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T452_U20863_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T453_U20864_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T454_U20867_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T455_U20868_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T456_U20869_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T457_U20870_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T458_U20872_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T459_U20873_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T460_U20874_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T461_U20878_M0> (15 requests):
- 15 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T462_U20880_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T463_U20881_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T465_U20884_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T466_U20886_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T467_U20887_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T468_U20888_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T469_U20890_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T470_U20891_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T471_U20892_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T472_U20893_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T473_U20896_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T474_U20897_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T475_U20899_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T476_U20901_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T477_U20902_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T478_U20903_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 05:06:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 185
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |8 |norm|T22_U19960_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |
| 1| |DIA |WP_KILL| |214|norm|T59_U19947_M0 |HTTP_NORM| | | |SAPMHTTP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10442|SAPMSSY1 |001|SM_EFWK | | |
| 3|19909 |DIA |WP_RUN | | |norm|T447_U20841_M0 |HTTP_NORM| | | 3| | | | | |
| 4|19804 |DIA |WP_RUN | | |high|T464_U20904_M0 |INTERNAL | | | 3| | | | | |
| 5| |DIA |WP_KILL| |8 |high|T109_U5012_M1 |GUI | | | |SBAL_DELETE |001|EXT_SCHAITAN| | |
| 6| |DIA |WP_KILL| |9 | | | | | | | | | | | |
| 7| |DIA |WP_KILL| |8 |norm|T55_U19953_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |

Found 8 active workprocesses


Total number of workprocesses is 16

Session Table Sun Sep 22 05:06:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| | |norm|3 | | | 0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| | |norm|6 | | | 0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| | |norm|3 | | | 0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |SAPMSSY1 |low | | | | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| | |norm|12 | | | 0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| | |norm|6 | | | 0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| | |norm|5 | | | 0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| | |norm|7 | | | 0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| | |norm|3 | | | 0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| | |norm|3 | | | 0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| | |norm|4 | | | 0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| | |norm|12 | | | 0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| | |norm|6 | | | 0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| | |norm|3 | | | 0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |SAPMSSY1 |low |2 | | | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| | |norm|3 | | | 0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| | |norm|3 | | | 0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| | |norm|3 | | | 0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| | |norm|3 | | | 0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| | |norm|14 | | | 0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| | |norm|7 | | | 0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| | |norm|4 | | | 0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| | |norm|3 | | | 0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| | |norm|3 | | | 0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| | |norm|3 | | | 0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |SAPMSSY1 |low | | | | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| | |norm|4 | | | 0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| | |norm|3 | | | 0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| | |norm|1 | | | 0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| | |norm|5 | | | 0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |SAPMSSY1 |norm|2 | | | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| | |norm|3 | | | 0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |SAPMHTTP |norm|1 | | | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| | |norm|13 | | | 0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| | |norm|3 | | | 0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| | |norm|3 | | | 0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| | |norm|5 | | | 0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |SAPMSSY1 |norm| | | | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| | |norm|3 | | | 0|
|HTTP_NORMAL |T71_U20451_M0 | | |10.54.36.35 |04:49:29| | |norm|3 | | | 0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| | |norm|3 | | | 0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| | |norm|1 | | | 0|
|SYNC_RFC |T75_U20558_M0 | | |smprd02.niladv.org |04:53:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| | |norm|4 | | | 0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| | |norm|3 | | | 0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| | |norm|3 | | | 0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| | |norm|1 | | | 0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |SAPMSSY1 |norm|1 | | | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| | |norm|25 | | | 0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| | |norm|3 | | | 0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| | |norm|3 | | | 0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| | |norm|3 | | | 0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| | |norm|1 | | | 0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| | |norm|1 | | | 0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| | |norm|1 | | | 0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |SAPMSSY1 |norm|1 | | | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| | |norm|3 | | | 0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| | |low |1 | | | 0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| | |norm|1 | | | 0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| | |norm|3 | | | 0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| | |norm|3 | | | 0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 | |SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| | |norm|3 | | | 0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| | |norm|5 | | | 0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| | |norm|3 | | | 0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| | |norm|3 | | | 0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |SAPMSSY1 |norm|1 | | | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| | |norm|12 | | | 0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| | |norm|1 | | | 0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| | |norm|3 | | | 0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| | |norm|3 | | | 0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |SAPMSSY1 |norm|1 | | | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| | |norm|6 | | | 0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| | |norm|3 | | | 0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| | |norm|1 | | | 0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| | |norm|1 | | | 0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| | |norm|3 | | | 0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| | |norm|4 | | | 0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| | |norm|3 | | | 0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| | |norm|1 | | | 0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| | |norm|3 | | | 0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| | |norm|3 | | | 0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| | |norm|1 | | | 0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |SAPMSSY1 |low | | | | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| | |norm|5 | | | 0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| | |norm|3 | | | 0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| | |norm|3 | | | 0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| | |norm|7 | | | 0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| | |norm|3 | | | 0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| | |norm|3 | | | 0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| | |norm|1 | | | 0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| | |norm|4 | | | 0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| | |norm|4 | | | 0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| | |norm|3 | | | 0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| | |norm|4 | | | 0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| | |norm|4 | | | 0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| | |norm|3 | | | 0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| | |norm|3 | | | 0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| | |norm|1 | | | 0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| | |norm|3 | | | 0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| | |norm|1 | | | 0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| | |norm|18 | | | 0|
|HTTP_NORMAL |T191_U20619_M0 | | |10.54.36.29 |04:56:48| | |norm|2 | | | 0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| | |norm|1 | | | 0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| | |norm|4 | | | 0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| | |norm|13 | | | 0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| | |norm|18 | | | 0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| | |norm|4 | | | 0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| | |norm|4 | | | 0|
|HTTP_NORMAL |T208_U20635_M0 | | |10.54.36.37 |04:57:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| | |norm|6 | | | 0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| | |norm|6 | | | 0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| | |norm|3 | | | 0|
|HTTP_NORMAL |T213_U20687_M0 | | |10.54.36.11 |04:59:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T214_U20638_M0 | | |10.54.36.29 |04:57:48| | |norm|2 | | | 0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| | |norm|3 | | | 0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| | |norm|3 | | | 0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| | |norm|1 | | | 0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| | |norm|4 | | | 0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| | |norm|6 | | | 0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| | |norm|4 | | | 0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| | |norm|6 | | | 0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| | |norm|6 | | | 0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| | |norm|3 | | | 0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| | |norm|3 | | | 0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| | |norm|3 | | | 0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| | |norm|5 | | | 0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| | |norm|4 | | | 0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| | |norm|4 | | | 0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| | |norm|17 | | | 0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| | |norm|12 | | | 0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| | |norm|17 | | | 0|
|SYNC_RFC |T256_U20748_M0 | | |smprd02.niladv.org |05:01:50| | |norm|1 | | | 0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| | |norm|3 | | | 0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| | |norm|6 | | | 0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| | |norm|17 | | | 0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T267_U20801_M0 | | |10.54.36.35 |05:03:28| | |norm|2 | | | 0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T270_U20802_M0 | | |10.54.36.13 |05:03:30| | |norm|3 | | | 0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| | |norm|3 | | | 0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| | |norm|1 | | | 0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| | |norm|3 | | | 0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| | |norm|3 | | | 0|
|SYNC_RFC |T277_U20600_M0 | | |smprd02.niladv.org |04:55:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| | |norm|3 | | | 0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| | |norm|6 | | | 0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| | |norm|3 | | | 0|
|HTTP_NORMAL |T287_U20582_M0 | | |10.54.36.12 |04:55:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T288_U20578_M0 | | |10.54.36.29 |04:54:48| | |norm|10 | | | 0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| | |norm|3 | | | 0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| | |norm|4 | | | 0|
|SYNC_RFC |T293_U20809_M0 | | |smprd02.niladv.org |05:03:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T296_U20636_M0 | | |10.54.36.14 |04:57:37| | |norm|2 | | | 0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| | |norm|5 | | | 0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| | |norm|3 | | | 0|
|HTTP_NORMAL |T305_U20580_M0 | | |10.50.47.10 |04:54:54| | |norm|3 | | | 0|
|HTTP_NORMAL |T306_U20581_M0 | | |10.54.36.34 |04:54:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T307_U20585_M0 | | |10.54.36.17 |04:55:05| | |norm|3 | | | 0|
|ASYNC_RFC |T308_U20586_M0 | | |10.54.36.10 |04:55:10| | |norm|1 | | | 0|
|HTTP_NORMAL |T309_U20587_M0 | | |10.50.47.13 |04:55:12| | |norm|4 | | | 0|
|HTTP_NORMAL |T310_U20592_M0 | | |10.54.36.13 |04:55:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T311_U20594_M0 | | |10.54.36.28 |04:55:33| | |norm|4 | | | 0|
|HTTP_NORMAL |T312_U20595_M0 | | |10.54.36.26 |04:55:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T313_U20596_M0 | | |10.54.36.29 |04:55:38| | |norm|12 | | | 0|
|SYNC_RFC |T314_U20597_M0 | | |smprd02.niladv.org |04:55:39| | |norm|1 | | | 0|
|HTTP_NORMAL |T315_U20602_M0 | | |10.54.36.19 |04:55:58| | |norm|16 | | | 0|
|HTTP_NORMAL |T316_U20603_M0 | | |10.54.36.15 |04:56:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T317_U20605_M0 | | |10.54.36.30 |04:56:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T318_U20606_M0 | | |10.54.36.38 |04:56:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T319_U20613_M0 | | |10.54.36.11 |04:56:32| | |norm|3 | | | 0|
|SYNC_RFC |T320_U20615_M0 | | |smprd02.niladv.org |04:56:39| | |norm|1 | | | 0|
|HTTP_NORMAL |T321_U20616_M0 | | |10.54.36.41 |04:56:40| | |norm|2 | | | 0|
|SYNC_RFC |T322_U20620_M0 | | |smprd02.niladv.org |04:56:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T323_U20621_M0 | | |10.54.36.32 |04:56:50| | |norm|9 | | | 0|
|HTTP_NORMAL |T324_U20622_M0 | | |10.54.36.27 |04:56:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T325_U20624_M0 | | |10.54.36.33 |04:56:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T326_U20626_M0 | | |10.54.36.36 |04:57:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T327_U20627_M0 | | |10.54.36.25 |04:57:03| | |norm|2 | | | 0|
|SYNC_RFC |T328_U20630_M0 | | |smprd02.niladv.org |04:57:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T329_U20632_M0 | | |10.50.47.13 |04:57:13| | |norm|3 | | | 0|
|HTTP_NORMAL |T330_U20639_M0 | | |10.54.36.29 |04:57:48| | |norm|2 | | | 0|
|SYNC_RFC |T331_U20641_M0 | | |smprd02.niladv.org |04:57:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T332_U20642_M0 | | |10.50.47.10 |04:57:53| | |norm|2 | | | 0|
|HTTP_NORMAL |T333_U20643_M0 | | |10.54.36.34 |04:57:58| | |norm|2 | | | 0|
|SYNC_RFC |T334_U20692_M0 | | |smprd02.niladv.org |04:59:50| | |norm|1 | | | 0|
|HTTP_NORMAL |T335_U20645_M0 | | |10.54.36.12 |04:58:02| | |norm|5 | | | 0|
0|
|HTTP_NORMAL |T336_U20648_M0 | | |10.54.36.17 |04:58:05| |
|norm|2 | | |
0|
|HTTP_NORMAL |T337_U20653_M0 | | |10.54.36.35 |04:58:27| |
|norm|2 | | |
0|
|HTTP_NORMAL |T338_U20717_M0 | | |10.54.36.32 |05:00:50| |
|norm|2 | | |
0|
|SYNC_RFC |T339_U20673_M0 | | |smprd02.niladv.org |04:58:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T340_U20677_M0 | | |10.54.36.38 |04:59:02| |
|norm|3 | | |
0|
|SYNC_RFC |T341_U20690_M0 | | |smprd02.niladv.org |04:59:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T342_U20674_M0 | | |10.54.36.19 |04:58:58| |
|norm|9 | | |
0|
|HTTP_NORMAL |T343_U20691_M0 | | |10.54.36.29 |04:59:49| |
|norm|8 | | |
0|
|HTTP_NORMAL |T344_U20676_M0 | | |10.54.36.15 |04:59:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T345_U20679_M0 | | |10.54.36.30 |04:59:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T346_U20662_M0 | | |10.54.36.13 |04:58:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T347_U20664_M0 | | |10.54.36.26 |04:58:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T348_U20723_M0 | | |10.54.36.12 |05:01:01| |
|norm|3 | | |
0|
|SYNC_RFC |T349_U20669_M0 | | |smprd02.niladv.org |04:58:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T350_U20670_M0 | | |10.54.36.32 |04:58:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T351_U20671_M0 | | |10.54.36.27 |04:58:50| |
|norm|3 | | |
0|
|SYNC_RFC |T352_U20716_M0 | | |smprd02.niladv.org |05:00:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T353_U20694_M0 | | |10.54.36.33 |04:59:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T354_U20697_M0 | | |10.54.36.25 |05:00:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T355_U20698_M0 | | |10.54.36.36 |05:00:04| |
|norm|2 | | |
0|
|HTTP_NORMAL |T356_U20700_M0 | | |10.50.47.13 |05:00:12| |
|norm|2 | | |
0|
|SYNC_RFC |T357_U20702_M0 | | |smprd02.niladv.org |05:00:14| |
|norm|1 | | |
0|
|SYNC_RFC |T358_U20703_M0 | | |smprd02.niladv.org |05:00:16| |
|norm|1 | | |
0|
|SYNC_RFC |T359_U20704_M0 | | |smprd02.niladv.org |05:00:18| |
|norm|1 | | |
0|
|SYNC_RFC |T360_U20705_M0 | | |smprd02.niladv.org |05:00:20| |
|norm|1 | | |
0|
|SYNC_RFC |T361_U20707_M0 | | |smprd02.niladv.org |05:00:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T362_U20711_M0 | | |10.54.36.37 |05:00:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T363_U20712_M0 | | |10.54.36.14 |05:00:37| |
|norm|2 | | |
0|
|HTTP_NORMAL |T364_U20713_M0 | | |10.54.36.29 |05:00:38| |
|norm|10 | | |
0|
|SYNC_RFC |T365_U20714_M0 | | |smprd02.niladv.org |05:00:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T366_U20718_M0 | | |10.54.36.27 |05:00:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T367_U20720_M0 | | |10.50.47.10 |05:00:53| |
|norm|2 | | |
0|
|HTTP_NORMAL |T368_U20721_M0 | | |10.54.36.40 |05:00:56| |
|norm|2 | | |
0|
|HTTP_NORMAL |T369_U20722_M0 | | |10.54.36.34 |05:00:57| |
|norm|5 | | |
0|
|HTTP_NORMAL |T370_U20724_M0 | | |10.54.36.15 |05:01:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T371_U20725_M0 | | |10.54.36.17 |05:01:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T372_U20727_M0 | | |10.54.36.38 |05:01:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T373_U20738_M0 | | |10.54.36.35 |05:01:27| |
|norm|2 | | |
0|
|HTTP_NORMAL |T374_U20729_M0 | | |10.54.36.30 |05:01:04| |
|norm|2 | | |
0|
|SYNC_RFC |T375_U20731_M0 | | |smprd02.niladv.org |05:01:14| |
|norm|1 | | |
0|
|SYNC_RFC |T376_U20732_M0 | | |smprd02.niladv.org |05:01:16| |
|norm|1 | | |
0|
|SYNC_RFC |T377_U20733_M0 | | |smprd02.niladv.org |05:01:18| |
|norm|1 | | |
0|
|SYNC_RFC |T378_U20734_M0 | | |smprd02.niladv.org |05:01:20| |
|norm|1 | | |
0|
|SYNC_RFC |T379_U20736_M0 | | |smprd02.niladv.org |05:01:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T380_U20739_M0 | | |10.54.36.13 |05:01:30| |
|norm|2 | | |
0|
|HTTP_NORMAL |T381_U20740_M0 | | |10.54.36.26 |05:01:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T382_U20741_M0 | | |10.54.36.28 |05:01:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T383_U20742_M0 | | |10.54.36.11 |05:01:32| |
|norm|2 | | |
0|
|SYNC_RFC |T384_U20876_M0 | | |smprd02.niladv.org |05:05:50| |
|norm|1 | | |
0|
|SYNC_RFC |T385_U20745_M0 | | |smprd02.niladv.org |05:01:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T386_U20750_M0 | | |10.54.36.19 |05:01:57| |
|norm|5 | | |
0|
|HTTP_NORMAL |T387_U20751_M0 | | |10.54.36.33 |05:01:58| |
|norm|2 | | |
0|
|SYNC_RFC |T388_U20752_M0 | | |smprd02.niladv.org |05:02:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T389_U20754_M0 | | |10.54.36.25 |05:02:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T390_U20755_M0 | | |10.54.36.36 |05:02:04| |
|norm|2 | | |
0|
|SYNC_RFC |T391_U20758_M0 | | |smprd02.niladv.org |05:02:11| |
|norm|1 | | |
0|
|SYNC_RFC |T392_U20760_M0 | | |smprd02.niladv.org |05:02:14| |
|norm|1 | | |
0|
|SYNC_RFC |T393_U20761_M0 | | |smprd02.niladv.org |05:02:16| |
|norm|1 | | |
0|
|SYNC_RFC |T394_U20762_M0 | | |smprd02.niladv.org |05:02:18| |
|norm|1 | | |
0|
|SYNC_RFC |T395_U20763_M0 | | |smprd02.niladv.org |05:02:20| |
|norm|1 | | |
0|
|SYNC_RFC |T396_U20764_M0 | | |smprd02.niladv.org |05:02:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T397_U20769_M0 | | |10.54.36.37 |05:02:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T398_U20877_M0 | | |10.54.36.27 |05:05:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T399_U20774_M0 | | |10.54.36.29 |05:02:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T400_U20775_M0 | | |10.54.36.29 |05:02:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T401_U20776_M0 | | |10.54.36.29 |05:02:48| |
|norm|2 | | |
0|
|SYNC_RFC |T402_U20777_M0 | | |smprd02.niladv.org |05:02:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T403_U20778_M0 | | |10.54.36.27 |05:02:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T404_U20779_M0 | | |10.54.36.32 |05:02:50| |
|norm|10 | | |
0|
|HTTP_NORMAL |T405_U20781_M0 | | |10.50.47.10 |05:02:53| |
|norm|2 | | |
0|
|SYNC_RFC |T406_U20782_M0 | | |smprd02.niladv.org |05:02:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T407_U20783_M0 | | |10.54.36.34 |05:02:58| |
|norm|2 | | |
0|
|SYNC_RFC |T408_U20784_M0 | | |smprd02.niladv.org |05:03:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T409_U20785_M0 | | |10.54.36.12 |05:03:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T410_U20786_M0 | | |10.54.36.17 |05:03:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T411_U20788_M0 | | |10.54.36.15 |05:03:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T412_U20789_M0 | | |10.54.36.38 |05:03:04| |
|norm|5 | | |
0|
|HTTP_NORMAL |T413_U20790_M0 | | |10.54.36.30 |05:03:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T414_U20793_M0 | | |10.50.47.13 |05:03:11| |
|norm|4 | | |
0|
|SYNC_RFC |T415_U20795_M0 | | |smprd02.niladv.org |05:03:14| |
|norm|1 | | |
0|
|SYNC_RFC |T416_U20796_M0 | | |smprd02.niladv.org |05:03:16| |
|norm|1 | | |
0|
|SYNC_RFC |T417_U20797_M0 | | |smprd02.niladv.org |05:03:18| |
|norm|1 | | |
0|
|SYNC_RFC |T418_U20798_M0 | | |smprd02.niladv.org |05:03:20| |
|norm|1 | | |
0|
|SYNC_RFC |T419_U20799_M0 | | |smprd02.niladv.org |05:03:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T420_U20803_M0 | | |10.54.36.26 |05:03:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T421_U20804_M0 | | |10.54.36.28 |05:03:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T422_U20805_M0 | | |10.54.36.11 |05:03:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T423_U20807_M0 | | |10.54.36.41 |05:03:39| |
|norm|2 | | |
0|
|SYNC_RFC |T424_U20810_M0 | | |smprd02.niladv.org |05:03:50| |
|norm|1 | | |
0|
|SYNC_RFC |T425_U20812_M0 | | |smprd02.niladv.org |05:03:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T426_U20813_M0 | | |10.54.36.19 |05:03:58| |
|norm|9 | | |
0|
|HTTP_NORMAL |T427_U20814_M0 | | |10.54.36.33 |05:03:58| |
|norm|2 | | |
0|
|SYNC_RFC |T428_U20844_M0 | | |smprd02.niladv.org |05:04:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T429_U20817_M0 | | |10.54.36.25 |05:04:03| |
|norm|3 | | |
0|
|SYNC_RFC |T430_U20818_M0 | | |smprd02.niladv.org |05:04:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T431_U20819_M0 | | |10.54.36.36 |05:04:04| |
|norm|3 | | |
0|
|SYNC_RFC |T432_U20821_M0 | | |smprd02.niladv.org |05:04:10| |
|norm|1 | | |
0|
|SYNC_RFC |T433_U20823_M0 | | |smprd02.niladv.org |05:04:14| |
|norm|1 | | |
0|
|SYNC_RFC |T434_U20824_M0 | | |smprd02.niladv.org |05:04:16| |
|norm|1 | | |
0|
|SYNC_RFC |T435_U20825_M0 | | |smprd02.niladv.org |05:04:18| |
|norm|1 | | |
0|
|SYNC_RFC |T436_U20827_M0 | | |smprd02.niladv.org |05:04:20| |
|norm|1 | | |
0|
|SYNC_RFC |T437_U20828_M0 | | |smprd02.niladv.org |05:04:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T438_U20857_M0 | | |10.50.47.13 |05:05:12| |
|norm|1 | | |
0|
|SYNC_RFC |T439_U20856_M0 | | |smprd02.niladv.org |05:05:10| |
|norm|1 | | |
0|
|SYNC_RFC |T440_U20854_M0 | | |smprd02.niladv.org |05:05:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T441_U20847_M0 | | |10.50.47.10 |05:04:54| |
|norm|2 | | |
0|
|HTTP_NORMAL |T442_U20851_M0 | | |10.54.36.17 |05:05:02| |
|norm|1 | | |
0|
|HTTP_NORMAL |T443_U20850_M0 | | |10.54.36.12 |05:05:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T444_U20853_M0 | | |10.54.36.15 |05:05:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T445_U20848_M0 | | |10.54.36.34 |05:04:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T446_U20840_M0 | | |10.54.36.37 |05:04:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T447_U20841_M0 | | |10.54.36.14 |05:06:40|3 |
|norm|2 | | |
0|
|HTTP_NORMAL |T448_U20845_M0 | | |10.54.36.29 |05:04:49| |
|norm|1 | | |
0|
|SYNC_RFC |T449_U20859_M0 | | |smprd02.niladv.org |05:05:14| |
|norm|1 | | |
0|
|SYNC_RFC |T450_U20860_M0 | | |smprd02.niladv.org |05:05:16| |
|norm|1 | | |
0|
|SYNC_RFC |T451_U20861_M0 | | |smprd02.niladv.org |05:05:18| |
|norm|1 | | |
0|
|SYNC_RFC |T452_U20863_M0 | | |smprd02.niladv.org |05:05:20| |
|norm|1 | | |
0|
|SYNC_RFC |T453_U20864_M0 | | |smprd02.niladv.org |05:05:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T454_U20867_M0 | | |10.54.36.35 |05:05:28| |
|norm|1 | | |
0|
|HTTP_NORMAL |T455_U20868_M0 | | |10.54.36.13 |05:05:31| |
|norm|4 | | |
0|
|HTTP_NORMAL |T456_U20869_M0 | | |10.54.36.26 |05:05:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T457_U20870_M0 | | |10.54.36.28 |05:05:33| |
|norm|1 | | |
0|
|HTTP_NORMAL |T458_U20872_M0 | | |10.54.36.29 |05:05:38| |
|norm|9 | | |
0|
|HTTP_NORMAL |T459_U20873_M0 | | |10.54.36.41 |05:05:40| |
|norm|1 | | |
0|
|SYNC_RFC |T460_U20874_M0 | | |smprd02.niladv.org |05:05:40| |
|norm|1 | | |
0|
|HTTP_NORMAL |T461_U20878_M0 | | |10.54.36.32 |05:05:51| |
|norm|15 | | |
0|
|HTTP_NORMAL |T462_U20880_M0 | | |10.54.36.19 |05:05:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T463_U20881_M0 | | |10.54.36.33 |05:05:59| |
|norm|1 | | |
0|
|INTERNAL |T464_U20904_M0 | | | |05:06:40|4 |
|high| | | |
0|
|HTTP_NORMAL |T465_U20884_M0 | | |10.54.36.30 |05:06:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T466_U20886_M0 | | |10.54.36.38 |05:06:03| |
|norm|4 | | |
0|
|SYNC_RFC |T467_U20887_M0 | | |smprd02.niladv.org |05:06:05| |
|norm|1 | | |
0|
|SYNC_RFC |T468_U20888_M0 | | |smprd02.niladv.org |05:06:05| |
|norm|1 | | |
0|
|SYNC_RFC |T469_U20890_M0 | | |smprd02.niladv.org |05:06:14| |
|norm|1 | | |
0|
|SYNC_RFC |T470_U20891_M0 | | |smprd02.niladv.org |05:06:15| |
|norm|1 | | |
0|
|SYNC_RFC |T471_U20892_M0 | | |smprd02.niladv.org |05:06:17| |
|norm|1 | | |
0|
|SYNC_RFC |T472_U20893_M0 | | |smprd02.niladv.org |05:06:18| |
|norm|1 | | |
0|
|SYNC_RFC |T473_U20896_M0 | | |smprd02.niladv.org |05:06:20| |
|norm|1 | | |
0|
|SYNC_RFC |T474_U20897_M0 | | |smprd02.niladv.org |05:06:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T475_U20899_M0 | | |10.54.36.11 |05:06:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T476_U20901_M0 | | |10.54.36.37 |05:06:33| |
|norm|1 | | |
0|
|HTTP_NORMAL |T477_U20902_M0 | | |10.54.36.14 |05:06:37| |
|norm|1 | | |
0|
|SYNC_RFC |T478_U20903_M0 | | |smprd02.niladv.org |05:06:40| |
|norm|1 | | |
0|

Found 478 logons with 478 sessions


Total ES (gross) memory of all sessions: 75 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T12_U20220_M0


Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 05:06:43:856 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T253_U20470_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0
Force ABAP stack dump of session T288_U20578_M0
Force ABAP stack dump of session T313_U20596_M0
Force ABAP stack dump of session T315_U20602_M0
Force ABAP stack dump of session T364_U20713_M0
Force ABAP stack dump of session T404_U20779_M0
Force ABAP stack dump of session T461_U20878_M0

RFC-Connection Table (130 entries) Sun Sep 22 05:06:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |
CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 12|53370125|53370125SU20716_M0 |T352_U20716_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 16|53750985|53750985SU20896_M0 |T473_U20896_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 21|53677905|53677905SU20861_M0 |T451_U20861_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 26|54305161|54305161SU20692_M0 |T334_U20692_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 31|53533790|53533790SU20795_M0 |T415_U20795_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 33|54303059|54303059SU20690_M0 |T341_U20690_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 34|53477705|53477705SU20764_M0 |T396_U20764_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 37|53606863|53606863SU20824_M0 |T434_U20824_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 38|53461465|53461465SU20758_M0 |T391_U20758_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |
CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 42|53536792|53536792SU20796_M0 |T416_U20796_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 43|53659094|53659094SU20854_M0 |T440_U20854_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 44|54184460|54184460SU20641_M0 |T331_U20641_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 45|53745004|53745004SU20892_M0 |T471_U20892_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 50|53747978|53747978SU20893_M0 |T472_U20893_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |
CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 56|53666302|53666302SU20856_M0 |T439_U20856_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 67|54240969|54240969SU20669_M0 |T349_U20669_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 68|53474712|53474712SU20763_M0 |T395_U20763_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 69|53609831|53609831SU20825_M0 |T435_U20825_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 71|53603861|53603861SU20823_M0 |T433_U20823_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 72|53504970|53504970SU20777_M0 |T402_U20777_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 76|53740983|53740983SU20890_M0 |T469_U20890_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 81|53539769|53539769SU20797_M0 |T417_U20797_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 82|53542786|53542786SU20798_M0 |T418_U20798_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 84|54055969|54055969SU20600_M0 |T277_U20600_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 85|53753961|53753961SU20897_M0 |T474_U20897_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |
SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 90|53335572|53335572SU20704_M0 |T359_U20704_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 96|53683884|53683884SU20864_M0 |T453_U20864_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 98|32252616|32252616SU20586_M0 |T308_U20586_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 106|53713330|53713330SU20876_M0 |T384_U20876_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 121|53450513|53450513SU20752_M0 |T388_U20752_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 126|54246530|54246530SU20673_M0 |T339_U20673_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 130|53730608|53730608SU20888_M0 |T468_U20888_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 135|53772320|53772320SU20903_M0 |T478_U20903_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |
SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 145|53615810|53615810SU20828_M0 |T437_U20828_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 149|53338570|53338570SU20705_M0 |T360_U20705_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 153|53674932|53674932SU20860_M0 |T450_U20860_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 158|53573044|53573044SU20809_M0 |T293_U20809_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 160|53405636|53405636SU20734_M0 |T378_U20734_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 165|53591025|53591025SU20818_M0 |T430_U20818_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 169|53671936|53671936SU20859_M0 |T449_U20859_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 171|53575295|53575295SU20810_M0 |T424_U20810_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 175|53402644|53402644SU20733_M0 |T377_U20733_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 176|54333571|54333571SU20703_M0 |T358_U20703_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 177|54330571|54330571SU20702_M0 |T357_U20702_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 178|53359275|53359275SU20714_M0 |T365_U20714_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 180|53426328|53426328SU20745_M0 |T385_U20745_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 187|53341583|53341583SU20707_M0 |T361_U20707_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 195|53702262|53702262SU20874_M0 |T460_U20874_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 199|53643123|53643123SU20844_M0 |T428_U20844_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 200|53612837|53612837SU20827_M0 |T436_U20827_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 201|53399648|53399648SU20732_M0 |T376_U20732_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 204|53438225|53438225SU20748_M0 |T256_U20748_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 209|53580137|53580137SU20812_M0 |T425_U20812_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 212|53519586|53519586SU20784_M0 |T408_U20784_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 213|53545747|53545747SU20799_M0 |T419_U20799_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 216|54141465|54141465SU20630_M0 |T328_U20630_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 221|54118046|54118046SU20620_M0 |T322_U20620_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 234|54107318|54107318SU20615_M0 |T320_U20615_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 236|53598228|53598228SU20821_M0 |T432_U20821_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 239|53468725|53468725SU20761_M0 |T393_U20761_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 242|53742654|53742654SU20891_M0 |T470_U20891_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |
SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 253|53408645|53408645SU20736_M0 |T379_U20736_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 254|53680911|53680911SU20863_M0 |T452_U20863_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 259|53465735|53465735SU20760_M0 |T392_U20760_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 263|54045247|54045247SU20597_M0 |T314_U20597_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 265|53471695|53471695SU20762_M0 |T394_U20762_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 271|53396676|53396676SU20731_M0 |T375_U20731_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |
SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 282|53511057|53511057SU20782_M0 |T406_U20782_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 284|53729452|53729452SU20887_M0 |T467_U20887_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 130 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
2 WORKER 19804
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
376 INVALID -1
377 INVALID -1
378 INVALID -1
379 INVALID -1
380 INVALID -1
381 INVALID -1
382 INVALID -1
383 INVALID -1
384 INVALID -1
385 INVALID -1
386 INVALID -1
387 INVALID -1
388 INVALID -1
389 INVALID -1
390 INVALID -1
391 INVALID -1
392 INVALID -1
393 INVALID -1
394 INVALID -1
395 INVALID -1
396 INVALID -1
397 INVALID -1
398 INVALID -1
399 INVALID -1
400 INVALID -1
401 INVALID -1
402 INVALID -1
403 INVALID -1
404 INVALID -1
405 INVALID -1
406 INVALID -1
407 INVALID -1
408 INVALID -1
409 INVALID -1
410 INVALID -1
411 INVALID -1
412 INVALID -1
413 INVALID -1
414 INVALID -1
415 INVALID -1
416 INVALID -1
417 INVALID -1
418 INVALID -1
419 INVALID -1
420 INVALID -1
421 INVALID -1
422 INVALID -1
423 INVALID -1
424 INVALID -1
425 INVALID -1
426 INVALID -1
427 INVALID -1
428 INVALID -1
429 INVALID -1
... skip next entries
100 ca_blk slots of 6000 in use, 97 currently unowned (in request queues)

MPI Info Sun Sep 22 05:06:43 2019


------------------------------------------------------------
Current pipes in use: 203
Current / maximal blocks in use: 217 / 1884
Periodic Tasks Sun Sep 22 05:06:43 2019
------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5712| 77| | |
| 1|DDLOG | 5712| 77| | |
| 2|BTCSCHED | 11422| 21| | |
| 3|RESTART_ALL | 2285| 193| | |
| 4|ENVCHECK | 34279| 20| | |
| 5|AUTOABAP | 2285| 193| | |
| 6|BGRFC_WATCHDOG | 2286| 193| | |
| 7|AUTOTH | 305| 21| | |
| 8|AUTOCCMS | 11421| 21| | |
| 9|AUTOSECURITY | 11420| 21| | |
| 10|LOAD_CALCULATION | 684635| 1| | |
| 11|SPOOLALRM | 11426| 21| | |
| 12|CALL_DELAYED | 0| 521| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 185 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:06:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:06:53:846 2019


DpHdlSoftCancel: cancel request for T448_U20845_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:06:59:537 2019


DpHdlSoftCancel: cancel request for T441_U20847_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T445_U20848_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:07:03:836 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:07:04:154 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:07:05:947 2019


DpHdlSoftCancel: cancel request for T442_U20851_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T443_U20850_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T444_U20853_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:07:15:957 2019


DpHdlSoftCancel: cancel request for T438_U20857_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:07:23:837 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:07:24:176 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:07:33:974 2019


DpHdlSoftCancel: cancel request for T457_U20870_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T454_U20867_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T455_U20868_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T456_U20869_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:07:43:837 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 27295

Sun Sep 22 05:07:43:981 2019


DpHdlSoftCancel: cancel request for T459_U20873_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:07:58:199 2019


DpHdlSoftCancel: cancel request for T398_U20877_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T461_U20878_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:08:03:204 2019


DpHdlSoftCancel: cancel request for T465_U20884_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T462_U20880_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T463_U20881_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:08:03:838 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:08:04:153 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:08:09:213 2019


DpHdlSoftCancel: cancel request for T498_U20938_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T466_U20886_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:08:22:659 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:08:23:839 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 27295 terminated

Sun Sep 22 05:08:33:232 2019


DpHdlSoftCancel: cancel request for T475_U20899_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:08:38:237 2019


DpHdlSoftCancel: cancel request for T499_U20939_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T476_U20901_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T477_U20902_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:08:43:840 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
Sun Sep 22 05:08:44:787 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:08:58:250 2019


DpHdlSoftCancel: cancel request for T300_U20907_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:03:255 2019


DpHdlSoftCancel: cancel request for T464_U20911_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T447_U20910_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T480_U20909_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:03:840 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:09:04:153 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:09:09:808 2019


DpHdlSoftCancel: cancel request for T481_U20912_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T482_U20914_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:17:484 2019


DpHdlSoftCancel: cancel request for T487_U20919_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:23:841 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:09:24:286 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:09:33:492 2019


DpHdlSoftCancel: cancel request for T494_U20930_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T483_U20929_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T495_U20931_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:38:494 2019


DpHdlSoftCancel: cancel request for T496_U20933_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:43:841 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:09:44:332 2019


DpHdlSoftCancel: delete in progress for T523_U20979_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:44:853 2019


DpHdlSoftCancel: delete in progress for T522_U20978_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:09:53:504 2019


DpHdlSoftCancel: cancel request for T497_U20937_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:10:03:842 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-29396
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-29397
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-29398
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-29399
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-29400

Sun Sep 22 05:10:03:993 2019


DpHdlSoftCancel: cancel request for T501_U20942_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:10:04:154 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:10:04:364 2019


DpHdlSoftCancel: delete in progress for T524_U20980_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:10:04:884 2019


DpHdlSoftCancel: delete in progress for T525_U20981_M0, ignore cancel request from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
*** ERROR => DpHdlDeadWp: W0 (pid 29396) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29396) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 29396)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 29397) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29397) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 29397)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 29398) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29398) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 29398)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 29399) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=29399) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 29399)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 29400) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W7 (pid = 29400)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
Sun Sep 22 05:10:23:843 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:10:39:020 2019


DpHdlSoftCancel: cancel request for T458_U20872_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:10:43:843 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:10:44:024 2019


DpHdlSoftCancel: cancel request for T509_U20958_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:10:44:401 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:10:52:933 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.918846 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.918819 / 0.000000

Sun Sep 22 05:10:54:034 2019


DpHdlSoftCancel: cancel request for T510_U20962_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T511_U20963_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
Sun Sep 22 05:10:56:936 2019
DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 05:10:59:039 2019


DpHdlSoftCancel: cancel request for T513_U20966_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T512_U20965_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:03:844 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:11:04:041 2019


DpHdlSoftCancel: cancel request for T516_U20969_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T515_U20968_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T517_U20971_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:04:154 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:11:09:046 2019


DpHdlSoftCancel: cancel request for T518_U20974_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:17:594 2019


DpHdlSoftCancel: cancel request for T526_U20982_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:23:845 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:11:24:436 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:11:34:611 2019


DpHdlSoftCancel: cancel request for T532_U20992_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T533_U20994_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T346_U20990_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T208_U20991_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:39:616 2019


DpHdlSoftCancel: cancel request for T534_U20995_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:43:845 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:11:44:453 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:11:49:627 2019


DpHdlSoftCancel: cancel request for T523_U20998_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:54:631 2019


DpHdlSoftCancel: cancel request for T535_U21000_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:11:59:636 2019


DpHdlSoftCancel: cancel request for T536_U21001_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:12:03:846 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:12:04:156 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:12:04:468 2019


DpHdlSoftCancel: cancel request for T538_U21004_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T537_U21002_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:12:20:773 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:12:23:847 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:07:43 2019, skip new snapshot

Sun Sep 22 05:12:29:493 2019


DpHdlSoftCancel: cancel request for T548_U21020_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:12:34:497 2019


DpHdlSoftCancel: cancel request for T549_U21021_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:12:40:790 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:12:43:848 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 186 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:12:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 05:12:43 2019


Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 2
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 05:12:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 1756 (peak 1758, writeCount 24105509, readCount 24103753)


UPD : 0 (peak 31, writeCount 4960, readCount 4960)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125408, readCount 2125408)
SPO : 0 (peak 2, writeCount 25115, readCount 25115)
UP2 : 0 (peak 1, writeCount 2346, readCount 2346)
DISP: 0 (peak 67, writeCount 890148, readCount 890148)
GW : 0 (peak 49, writeCount 22411022, readCount 22411022)
ICM : 0 (peak 186, writeCount 391103, readCount 391103)
LWP : 6 (peak 16, writeCount 38276, readCount 38270)

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 577 elements, peak 578):
-1 <- 579 < T548_U21020_M0> -> 421
579 <- 421 < T390_U20755_M0> -> 418
421 <- 418 < T387_U20751_M0> -> 420
418 <- 420 < T389_U20754_M0> -> 568
420 <- 568 < T537_U21002_M0> -> 569
568 <- 569 < T538_U21004_M0> -> 417
569 <- 417 < T386_U20750_M0> -> 567
417 <- 567 < T536_U21001_M0> -> 566
567 <- 566 < T535_U21000_M0> -> 553
566 <- 553 < T523_U20998_M0> -> 414
553 <- 414 < T383_U20742_M0> -> 413
414 <- 413 < T382_U20741_M0> -> 412
413 <- 412 < T381_U20740_M0> -> 411
412 <- 411 < T380_U20739_M0> -> 404
411 <- 404 < T373_U20738_M0> -> 239
404 <- 239 < T208_U20991_M0> -> 377
239 <- 377 < T346_U20990_M0> -> 564
377 <- 564 < T533_U20994_M0> -> 563
564 <- 563 < T532_U20992_M0> -> 557
563 <- 557 < T526_U20982_M0> -> 405
557 <- 405 < T374_U20729_M0> -> 402
405 <- 402 < T371_U20725_M0> -> 401
402 <- 401 < T370_U20724_M0> -> 403
401 <- 403 < T372_U20727_M0> -> 379
403 <- 379 < T348_U20723_M0> -> 549
379 <- 549 < T518_U20974_M0> -> 398
549 <- 398 < T367_U20720_M0> -> 400
398 <- 400 < T369_U20722_M0> -> 399
400 <- 399 < T368_U20721_M0> -> 548
399 <- 548 < T517_U20971_M0> -> 546
548 <- 546 < T515_U20968_M0> -> 547
546 <- 547 < T516_U20969_M0> -> 543
547 <- 543 < T512_U20965_M0> -> 544
543 <- 544 < T513_U20966_M0> -> 369
544 <- 369 < T338_U20717_M0> -> 397
369 <- 397 < T366_U20718_M0> -> 542
397 <- 542 < T511_U20963_M0> -> 541
542 <- 541 < T510_U20962_M0> -> 395
541 <- 395 < T364_U20713_M0> -> 540
395 <- 540 < T509_U20958_M0> -> 393
540 <- 393 < T362_U20711_M0> -> 394
393 <- 394 < T363_U20712_M0> -> 489
394 <- 489 < T458_U20872_M0> -> 387
489 <- 387 < T356_U20700_M0> -> 386
387 <- 386 < T355_U20698_M0> -> 385
386 <- 385 < T354_U20697_M0> -> 384
385 <- 384 < T353_U20694_M0> -> 532
384 <- 532 < T501_U20942_M0> -> 372
532 <- 372 < T343_U20691_M0> -> 528
372 <- 528 < T497_U20937_M0> -> 245
528 <- 245 < T213_U20687_M0> -> 527
245 <- 527 < T496_U20933_M0> -> 526
527 <- 526 < T495_U20931_M0> -> 514
526 <- 514 < T483_U20929_M0> -> 525
514 <- 525 < T494_U20930_M0> -> 518
525 <- 518 < T487_U20919_M0> -> 373
518 <- 373 < T345_U20679_M0> -> 374
373 <- 374 < T340_U20677_M0> -> 513
374 <- 513 < T482_U20914_M0> -> 512
513 <- 512 < T481_U20912_M0> -> 371
512 <- 371 < T342_U20674_M0> -> 375
371 <- 375 < T344_U20676_M0> -> 511
375 <- 511 < T480_U20909_M0> -> 478
511 <- 478 < T447_U20910_M0> -> 495
478 <- 495 < T464_U20911_M0> -> 382
495 <- 382 < T351_U20671_M0> -> 381
382 <- 381 < T350_U20670_M0> -> 331
381 <- 331 < T300_U20907_M0> -> 378
331 <- 378 < T347_U20664_M0> -> 508
378 <- 508 < T477_U20902_M0> -> 507
508 <- 507 < T476_U20901_M0> -> 530
507 <- 530 < T499_U20939_M0> -> 368
530 <- 368 < T337_U20653_M0> -> 506
368 <- 506 < T475_U20899_M0> -> 367
506 <- 367 < T336_U20648_M0> -> 366
367 <- 366 < T335_U20645_M0> -> 364
366 <- 364 < T333_U20643_M0> -> 497
364 <- 497 < T466_U20886_M0> -> 363
497 <- 363 < T332_U20642_M0> -> 529
363 <- 529 < T498_U20938_M0> -> 494
529 <- 494 < T463_U20881_M0> -> 361
494 <- 361 < T330_U20639_M0> -> 248
361 <- 248 < T214_U20638_M0> -> 493
248 <- 493 < T462_U20880_M0> -> 496
493 <- 496 < T465_U20884_M0> -> 492
496 <- 492 < T461_U20878_M0> -> 429
492 <- 429 < T398_U20877_M0> -> 327
429 <- 327 < T296_U20636_M0> -> 490
327 <- 490 < T459_U20873_M0> -> 487
490 <- 487 < T456_U20869_M0> -> 486
487 <- 486 < T455_U20868_M0> -> 485
486 <- 485 < T454_U20867_M0> -> 488
485 <- 488 < T457_U20870_M0> -> 360
488 <- 360 < T329_U20632_M0> -> 357
360 <- 357 < T326_U20626_M0> -> 358
357 <- 358 < T327_U20627_M0> -> 471
358 <- 471 < T444_U20853_M0> -> 474
471 <- 474 < T443_U20850_M0> -> 473
474 <- 473 < T442_U20851_M0> -> 356
473 <- 356 < T325_U20624_M0> -> 476
356 <- 476 < T441_U20847_M0> -> 355
476 <- 355 < T324_U20622_M0> -> 354
355 <- 354 < T323_U20621_M0> -> 222
354 <- 222 < T191_U20619_M0> -> 479
222 <- 479 < T448_U20845_M0> -> 352
479 <- 352 < T321_U20616_M0> -> 434
352 <- 434 < T404_U20779_M0> -> 346
434 <- 346 < T315_U20602_M0> -> 344
346 <- 344 < T313_U20596_M0> -> 319
344 <- 319 < T288_U20578_M0> -> 291
319 <- 291 < T260_U20481_M0> -> 286
291 <- 286 < T255_U20475_M0> -> 284
286 <- 284 < T253_U20470_M0> -> 282
284 <- 282 < T251_U20559_M0> -> 231
282 <- 231 < T200_U20354_M0> -> 228
231 <- 228 < T197_U20347_M0> -> 221
228 <- 221 < T190_U20350_M0> -> 62
221 <- 62 < T121_U20226_M0> -> 78
62 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 100
69 <- 100 < T38_U20216_M0> -> 54
100 <- 54 < T23_U20175_M0> -> 89
54 <- 89 < T12_U20220_M0> -> 350
89 <- 350 < T319_U20613_M0> -> 477
350 <- 477 < T446_U20840_M0> -> 348
477 <- 348 < T317_U20605_M0> -> 347
348 <- 347 < T316_U20603_M0> -> 349
347 <- 349 < T318_U20606_M0> -> 460
349 <- 460 < T429_U20817_M0> -> 457
460 <- 457 < T426_U20813_M0> -> 458
457 <- 458 < T427_U20814_M0> -> 462
458 <- 462 < T431_U20819_M0> -> 343
462 <- 343 < T312_U20595_M0> -> 342
343 <- 342 < T311_U20594_M0> -> 454
342 <- 454 < T423_U20807_M0> -> 341
454 <- 341 < T310_U20592_M0> -> 452
341 <- 452 < T421_U20804_M0> -> 451
452 <- 451 < T420_U20803_M0> -> 301
451 <- 301 < T270_U20802_M0> -> 298
301 <- 298 < T267_U20801_M0> -> 453
298 <- 453 < T422_U20805_M0> -> 340
453 <- 340 < T309_U20587_M0> -> 338
340 <- 338 < T307_U20585_M0> -> 445
338 <- 445 < T414_U20793_M0> -> 318
445 <- 318 < T287_U20582_M0> -> 337
318 <- 337 < T306_U20581_M0> -> 336
337 <- 336 < T305_U20580_M0> -> 441
336 <- 441 < T410_U20786_M0> -> 440
441 <- 440 < T409_U20785_M0> -> 442
440 <- 442 < T411_U20788_M0> -> 444
442 <- 444 < T413_U20790_M0> -> 443
444 <- 443 < T412_U20789_M0> -> 436
443 <- 436 < T405_U20781_M0> -> 438
436 <- 438 < T407_U20783_M0> -> 435
438 <- 435 < T403_U20778_M0> -> 430
435 <- 430 < T399_U20774_M0> -> 334
430 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> 428
335 <- 428 < T397_U20769_M0> -> 333
428 <- 333 < T302_U20568_M0> -> 332
333 <- 332 < T301_U20567_M0> -> 330
332 <- 330 < T299_U20563_M0> -> 328
330 <- 328 < T297_U20560_M0> -> 325
328 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 431
326 <- 431 < T400_U20775_M0> -> 309
431 <- 309 < T280_U20543_M0> -> 322
309 <- 322 < T291_U20546_M0> -> 312
322 <- 312 < T283_U20541_M0> -> 310
312 <- 310 < T282_U20544_M0> -> 323
310 <- 323 < T292_U20547_M0> -> 308
323 <- 308 < T276_U20537_M0> -> 314
308 <- 314 < T278_U20536_M0> -> 432
314 <- 432 < T401_U20776_M0> -> 277
432 <- 277 < T246_U20530_M0> -> 320
277 <- 320 < T289_U20532_M0> -> 317
320 <- 317 < T286_U20526_M0> -> 316
317 <- 316 < T285_U20525_M0> -> 315
316 <- 315 < T284_U20523_M0> -> 306
315 <- 306 < T275_U20514_M0> -> 276
306 <- 276 < T245_U20504_M0> -> 303
276 <- 303 < T272_U20501_M0> -> 300
303 <- 300 < T269_U20497_M0> -> 299
300 <- 299 < T268_U20492_M0> -> 297
299 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 292
296 <- 292 < T261_U20483_M0> -> 294
292 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 290
293 <- 290 < T259_U20480_M0> -> 289
290 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 281
288 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 275
274 <- 275 < T244_U20464_M0> -> 280
275 <- 280 < T249_U20458_M0> -> 271
280 <- 271 < T240_U20455_M0> -> 135
271 <- 135 < T71_U20451_M0> -> 278
135 <- 278 < T247_U20447_M0> -> 273
278 <- 273 < T242_U20441_M0> -> 272
273 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 270
241 <- 270 < T239_U20436_M0> -> 268
270 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 267
258 <- 267 < T236_U20429_M0> -> 265
267 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 264
266 <- 264 < T233_U20426_M0> -> 263
264 <- 263 < T232_U20424_M0> -> 262
263 <- 262 < T231_U20419_M0> -> 261
262 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 229
260 <- 229 < T198_U20411_M0> -> 259
229 <- 259 < T228_U20415_M0> -> 257
259 <- 257 < T226_U20412_M0> -> 119
257 <- 119 < T11_U20406_M0> -> 242
119 <- 242 < T212_U20408_M0> -> 247
242 <- 247 < T211_U20403_M0> -> 246
247 <- 246 < T215_U20400_M0> -> 255
246 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 253
249 <- 253 < T222_U20386_M0> -> 252
253 <- 252 < T221_U20383_M0> -> 240
252 <- 240 < T209_U20370_M0> -> 250
240 <- 250 < T219_U20381_M0> -> 237
250 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 236
238 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 230
233 <- 230 < T199_U20353_M0> -> 192
230 <- 192 < T161_U20349_M0> -> 226
192 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 220
224 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 213
218 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 206
210 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 202
204 <- 202 < T171_U20295_M0> -> 201
202 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 196
199 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 193
103 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 189
91 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 182
191 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 190
176 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 184
158 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 188
154 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 175
160 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 146
152 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 138
76 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 45
59 <- 45 < T76_U20199_M0> -> 39
45 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 148
171 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 129
170 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 58
144 <- 58 < T77_U20172_M0> -> 117
58 <- 117 < T31_U20168_M0> -> 60
117 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 70
163 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 118
114 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 93
167 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 285
283 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 321
305 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 311
313 <- 311 < T279_U20542_M0> -> 165
311 <- 165 < T75_U20558_M0> -> 329
165 <- 329 < T298_U20562_M0> -> 339
329 <- 339 < T308_U20586_M0> -> 345
339 <- 345 < T314_U20597_M0> -> 307
345 <- 307 < T277_U20600_M0> -> 351
307 <- 351 < T320_U20615_M0> -> 353
351 <- 353 < T322_U20620_M0> -> 359
353 <- 359 < T328_U20630_M0> -> 362
359 <- 362 < T331_U20641_M0> -> 380
362 <- 380 < T349_U20669_M0> -> 376
380 <- 376 < T339_U20673_M0> -> 370
376 <- 370 < T341_U20690_M0> -> 365
370 <- 365 < T334_U20692_M0> -> 388
365 <- 388 < T357_U20702_M0> -> 389
388 <- 389 < T358_U20703_M0> -> 390
389 <- 390 < T359_U20704_M0> -> 391
390 <- 391 < T360_U20705_M0> -> 392
391 <- 392 < T361_U20707_M0> -> 396
392 <- 396 < T365_U20714_M0> -> 383
396 <- 383 < T352_U20716_M0> -> 406
383 <- 406 < T375_U20731_M0> -> 407
406 <- 407 < T376_U20732_M0> -> 408
407 <- 408 < T377_U20733_M0> -> 409
408 <- 409 < T378_U20734_M0> -> 410
409 <- 410 < T379_U20736_M0> -> 416
410 <- 416 < T385_U20745_M0> -> 287
416 <- 287 < T256_U20748_M0> -> 419
287 <- 419 < T388_U20752_M0> -> 422
419 <- 422 < T391_U20758_M0> -> 423
422 <- 423 < T392_U20760_M0> -> 424
423 <- 424 < T393_U20761_M0> -> 425
424 <- 425 < T394_U20762_M0> -> 426
425 <- 426 < T395_U20763_M0> -> 427
426 <- 427 < T396_U20764_M0> -> 433
427 <- 433 < T402_U20777_M0> -> 437
433 <- 437 < T406_U20782_M0> -> 439
437 <- 439 < T408_U20784_M0> -> 446
439 <- 446 < T415_U20795_M0> -> 447
446 <- 447 < T416_U20796_M0> -> 448
447 <- 448 < T417_U20797_M0> -> 449
448 <- 449 < T418_U20798_M0> -> 450
449 <- 450 < T419_U20799_M0> -> 324
450 <- 324 < T293_U20809_M0> -> 455
324 <- 455 < T424_U20810_M0> -> 456
455 <- 456 < T425_U20812_M0> -> 461
456 <- 461 < T430_U20818_M0> -> 463
461 <- 463 < T432_U20821_M0> -> 464
463 <- 464 < T433_U20823_M0> -> 465
464 <- 465 < T434_U20824_M0> -> 466
465 <- 466 < T435_U20825_M0> -> 467
466 <- 467 < T436_U20827_M0> -> 468
467 <- 468 < T437_U20828_M0> -> 459
468 <- 459 < T428_U20844_M0> -> 472
459 <- 472 < T440_U20854_M0> -> 470
472 <- 470 < T439_U20856_M0> -> 480
470 <- 480 < T449_U20859_M0> -> 481
480 <- 481 < T450_U20860_M0> -> 482
481 <- 482 < T451_U20861_M0> -> 483
482 <- 483 < T452_U20863_M0> -> 484
483 <- 484 < T453_U20864_M0> -> 491
484 <- 491 < T460_U20874_M0> -> 415
491 <- 415 < T384_U20876_M0> -> 498
415 <- 498 < T467_U20887_M0> -> 499
498 <- 499 < T468_U20888_M0> -> 500
499 <- 500 < T469_U20890_M0> -> 501
500 <- 501 < T470_U20891_M0> -> 502
501 <- 502 < T471_U20892_M0> -> 503
502 <- 503 < T472_U20893_M0> -> 504
503 <- 504 < T473_U20896_M0> -> 505
504 <- 505 < T474_U20897_M0> -> 509
505 <- 509 < T478_U20903_M0> -> 510
509 <- 510 < T479_U20908_M0> -> 515
510 <- 515 < T484_U20916_M0> -> 516
515 <- 516 < T485_U20917_M0> -> 517
516 <- 517 < T486_U20918_M0> -> 519
517 <- 519 < T488_U20921_M0> -> 520
519 <- 520 < T489_U20922_M0> -> 521
520 <- 521 < T490_U20923_M0> -> 522
521 <- 522 < T491_U20924_M0> -> 523
522 <- 523 < T492_U20926_M0> -> 524
523 <- 524 < T493_U20927_M0> -> 475
524 <- 475 < T445_U20936_M0> -> 531
475 <- 531 < T500_U20941_M0> -> 533
531 <- 533 < T502_U20946_M0> -> 534
533 <- 534 < T503_U20947_M0> -> 535
534 <- 535 < T504_U20949_M0> -> 536
535 <- 536 < T505_U20950_M0> -> 537
536 <- 537 < T506_U20951_M0> -> 538
537 <- 538 < T507_U20952_M0> -> 539
538 <- 539 < T508_U20953_M0> -> 469
539 <- 469 < T438_U20961_M0> -> 545
469 <- 545 < T514_U20967_M0> -> 550
545 <- 550 < T519_U20975_M0> -> 551
550 <- 551 < T520_U20976_M0> -> 552
551 <- 552 < T521_U20977_M0> -> 558
552 <- 558 < T527_U20984_M0> -> 559
558 <- 559 < T528_U20985_M0> -> 560
559 <- 560 < T529_U20986_M0> -> 561
560 <- 561 < T530_U20987_M0> -> 562
561 <- 562 < T531_U20988_M0> -> 554
562 <- 554 < T522_U20997_M0> -> 570
554 <- 570 < T539_U21007_M0> -> 571
570 <- 571 < T540_U21008_M0> -> 572
571 <- 572 < T541_U21009_M0> -> 573
572 <- 573 < T542_U21010_M0> -> 574
573 <- 574 < T543_U21012_M0> -> 575
574 <- 575 < T544_U21013_M0> -> 576
575 <- 576 < T545_U21014_M0> -> 577
576 <- 577 < T546_U21015_M0> -> 578
577 <- 578 < T547_U21016_M0> -> 589
578 <- 589 < T558_U21031_M0> -> 590
589 <- 590 < T559_U21035_M0> -> 591
590 <- 591 < T560_U21036_M0> -> 592
591 <- 592 < T561_U21037_M0> -> 588
592 <- 588 < T557_U21039_M0> -> 587
588 <- 587 < T554_U21040_M0> -> 584
587 <- 584 < T553_U21041_M0> -> 586
584 <- 586 < T555_U21042_M0> -> 585
586 <- 585 < T556_U21043_M0> -> 582
585 <- 582 < T551_U21047_M0> -> 583
582 <- 583 < T552_U21048_M0> -> 581
583 <- 581 < T550_U21049_M0> -> 593
581 <- 593 < T562_U21050_M0> -> 594
593 <- 594 < T563_U21051_M0> -> 595
594 <- 595 < T564_U21053_M0> -> 596
595 <- 596 < T565_U21054_M0> -> 597
596 <- 597 < T566_U21055_M0> -> 598
597 <- 598 < T567_U21056_M0> -> 599
598 <- 599 < T568_U21057_M0> -> 600
599 <- 600 < T569_U21061_M0> -> 601
600 <- 601 < T570_U21062_M0> -> 602
601 <- 602 < T571_U21063_M0> -> 603
602 <- 603 < T572_U21065_M0> -> 604
603 <- 604 < T573_U21066_M0> -> 605
604 <- 605 < T574_U21067_M0> -> 556
605 <- 556 < T525_U21069_M0> -> 555
556 <- 555 < T524_U21071_M0> -> 606
555 <- 606 < T575_U21072_M0> -> 607
606 <- 607 < T576_U21073_M0> -> 609
607 <- 609 < T578_U21075_M0> -> 610
609 <- 610 < T579_U21076_M0> -> 611
610 <- 611 < T580_U21079_M0> -> 612
611 <- 612 < T581_U21080_M0> -> 613
612 <- 613 < T582_U21081_M0> -> 614
613 <- 614 < T583_U21082_M0> -> 615
614 <- 615 < T584_U21084_M0> -> 616
615 <- 616 < T585_U21085_M0> -> 617
616 <- 617 < T586_U21086_M0> -> 618
617 <- 618 < T587_U21088_M0> -> 619
618 <- 619 < T588_U21089_M0> -> 620
619 <- 620 < T589_U21092_M0> -> 621
620 <- 621 < T590_U21094_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1

Requests in queue <W0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W5> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W6> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W7> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (14 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (26 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (15 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (19 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20619_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (19 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20991_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20687_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T215_U20400_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20638_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T247_U20447_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (18 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (18 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20748_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T257_U20478_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (18 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20801_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20802_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20600_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T276_U20537_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T280_U20543_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T282_U20544_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T284_U20523_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20582_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T288_U20578_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T289_U20532_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T292_U20547_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T293_U20809_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T294_U20553_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T295_U20554_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T296_U20636_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T297_U20560_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T300_U20907_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T301_U20567_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T302_U20568_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T303_U20574_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T304_U20575_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T305_U20580_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T306_U20581_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T307_U20585_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T308_U20586_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T309_U20587_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T310_U20592_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T311_U20594_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T312_U20595_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T313_U20596_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T314_U20597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T315_U20602_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T316_U20603_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T317_U20605_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T318_U20606_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T319_U20613_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T320_U20615_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T321_U20616_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T322_U20620_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T323_U20621_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T324_U20622_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T325_U20624_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T326_U20626_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T327_U20627_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T328_U20630_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T329_U20632_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T330_U20639_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T331_U20641_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T332_U20642_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T333_U20643_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T334_U20692_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T335_U20645_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T336_U20648_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T337_U20653_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T338_U20717_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T341_U20690_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T342_U20674_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T343_U20691_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T345_U20679_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T340_U20677_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T344_U20676_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T339_U20673_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T346_U20990_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T347_U20664_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T348_U20723_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T349_U20669_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T350_U20670_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T351_U20671_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T352_U20716_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T353_U20694_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T354_U20697_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T355_U20698_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T356_U20700_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T357_U20702_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T358_U20703_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T359_U20704_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T360_U20705_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T361_U20707_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T362_U20711_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T363_U20712_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T364_U20713_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T365_U20714_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T366_U20718_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T367_U20720_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T368_U20721_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T369_U20722_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T370_U20724_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T371_U20725_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T372_U20727_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T373_U20738_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T374_U20729_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T375_U20731_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T376_U20732_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T377_U20733_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T378_U20734_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T379_U20736_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T380_U20739_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T381_U20740_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T382_U20741_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T383_U20742_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T384_U20876_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T385_U20745_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T386_U20750_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T387_U20751_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T388_U20752_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T389_U20754_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T390_U20755_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T391_U20758_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T392_U20760_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T393_U20761_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T394_U20762_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T395_U20763_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T396_U20764_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T397_U20769_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T398_U20877_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T399_U20774_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T400_U20775_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T401_U20776_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T402_U20777_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T404_U20779_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T403_U20778_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T405_U20781_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T406_U20782_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T407_U20783_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T408_U20784_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T409_U20785_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T410_U20786_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T411_U20788_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T412_U20789_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T413_U20790_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T414_U20793_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T415_U20795_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T416_U20796_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T417_U20797_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T418_U20798_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T419_U20799_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T420_U20803_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T421_U20804_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T422_U20805_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T423_U20807_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T424_U20810_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T425_U20812_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T426_U20813_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T427_U20814_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T428_U20844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T429_U20817_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T430_U20818_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T431_U20819_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T432_U20821_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T433_U20823_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T434_U20824_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T435_U20825_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T436_U20827_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T437_U20828_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T438_U20961_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T439_U20856_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T444_U20853_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T440_U20854_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T442_U20851_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T443_U20850_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T445_U20936_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T441_U20847_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T446_U20840_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T447_U20910_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T448_U20845_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T449_U20859_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T450_U20860_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T451_U20861_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T452_U20863_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T453_U20864_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T454_U20867_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T455_U20868_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T456_U20869_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T457_U20870_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T458_U20872_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T459_U20873_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T460_U20874_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T461_U20878_M0> (17 requests):
- 15 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T462_U20880_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T463_U20881_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T464_U20911_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T465_U20884_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T466_U20886_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T467_U20887_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T468_U20888_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T469_U20890_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T470_U20891_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T471_U20892_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T472_U20893_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T473_U20896_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T474_U20897_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T475_U20899_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T476_U20901_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T477_U20902_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T478_U20903_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T479_U20908_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T480_U20909_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T481_U20912_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T482_U20914_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T483_U20929_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T484_U20916_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T485_U20917_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T486_U20918_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T487_U20919_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T488_U20921_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T489_U20922_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T490_U20923_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T491_U20924_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T492_U20926_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T493_U20927_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T494_U20930_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T495_U20931_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T496_U20933_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T497_U20937_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T498_U20938_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T499_U20939_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T500_U20941_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T501_U20942_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T502_U20946_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T503_U20947_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T504_U20949_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T505_U20950_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T506_U20951_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T507_U20952_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T508_U20953_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T509_U20958_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T510_U20962_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T511_U20963_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T512_U20965_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T513_U20966_M0> (19 requests):
- 18 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T514_U20967_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T515_U20968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T516_U20969_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T517_U20971_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T518_U20974_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T519_U20975_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T520_U20976_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T521_U20977_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T523_U20998_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T522_U20997_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T524_U21071_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T525_U21069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T526_U20982_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T527_U20984_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T528_U20985_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T529_U20986_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T530_U20987_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T531_U20988_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T532_U20992_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T533_U20994_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T535_U21000_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T536_U21001_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T537_U21002_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T538_U21004_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T539_U21007_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T540_U21008_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T541_U21009_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T542_U21010_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T543_U21012_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T544_U21013_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T545_U21014_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T546_U21015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T547_U21016_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T548_U21020_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T549_U21021_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T550_U21049_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T551_U21047_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T552_U21048_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T553_U21041_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T556_U21043_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T555_U21042_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T554_U21040_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T557_U21039_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T558_U21031_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T559_U21035_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T560_U21036_M0> (8 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T561_U21037_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T562_U21050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T563_U21051_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T564_U21053_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T565_U21054_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T566_U21055_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T567_U21056_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T568_U21057_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T569_U21061_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T570_U21062_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T571_U21063_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T572_U21065_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T573_U21066_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T574_U21067_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T575_U21072_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T576_U21073_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T578_U21075_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T579_U21076_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T580_U21079_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T581_U21080_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T582_U21081_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T583_U21082_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T584_U21084_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T585_U21085_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T586_U21086_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T587_U21088_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T588_U21089_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T589_U21092_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T590_U21094_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
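A queue dump in this format can be summarized programmatically rather than read line by line. A minimal sketch (the line format is assumed from this trace output, not from any documented SAP interface):

```python
import re
from collections import Counter

# Matches lines such as:
#   Requests in queue <T339_U20673_M0> (1 requests):
#       - 1 requests for handler REQ_HANDLER_RFC
queue_re = re.compile(r"Requests in queue <(?P<queue>[^>]+)> \((?P<n>\d+) requests")
handler_re = re.compile(r"- (?P<n>\d+) requests for handler (?P<handler>\S+)")

def summarize(lines):
    """Count queues and tally queued requests per handler type."""
    per_handler = Counter()
    queues = 0
    for line in lines:
        if queue_re.search(line):
            queues += 1
        m = handler_re.search(line)
        if m:
            per_handler[m.group("handler")] += int(m.group("n"))
    return queues, per_handler

# A few lines copied from the dump above, as sample input
sample = [
    "Requests in queue <T339_U20673_M0> (1 requests):",
    "       - 1 requests for handler REQ_HANDLER_RFC",
    "Requests in queue <T346_U20990_M0> (3 requests):",
    "       - 2 requests for handler REQ_HANDLER_PLUGIN",
    "       - 1 requests for handler REQ_HANDLER_SESSION",
]
queues, per_handler = summarize(sample)
print(queues, dict(per_handler))
```

With the full dump piped in, this gives a quick picture of how the backlog splits across REQ_HANDLER_PLUGIN, REQ_HANDLER_SESSION, and REQ_HANDLER_RFC.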

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 05:12:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 186
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid   |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program     |Cli|User        |Action      |Action-Info |
|---|------|----|-------|-----|---|----|----------------|---------|------|---|-----|------------|---|------------|------------|------------|
| 0 |      |DIA |WP_KILL|     |9  |norm|T22_U19960_M0   |HTTP_NORM|      |   |     |            |001|SM_EXTERN_WS|            |            |
| 1 |      |DIA |WP_KILL|     |215|norm|T59_U19947_M0   |HTTP_NORM|      |   |     |SAPMHTTP    |001|SM_EXTERN_WS|            |            |
| 2 |32121 |DIA |WP_HOLD|RFC  |   |low |T10_U9773_M0    |ASYNC_RFC|      |   |10478|SAPMSSY1    |001|SM_EFWK     |            |            |
| 3 |19909 |DIA |WP_RUN |     |   |norm|T549_U21021_M0  |HTTP_NORM|      |   |    3|            |   |            |            |            |
| 4 |19804 |DIA |WP_RUN |     |   |high|T534_U21091_M0  |INTERNAL |      |   |    9|            |000|SAPSYS      |            |            |
| 5 |      |DIA |WP_KILL|     |9  |high|T109_U5012_M1   |GUI      |      |   |     |SBAL_DELETE |001|EXT_SCHAITAN|            |            |
| 6 |      |DIA |WP_KILL|     |10 |    |                |         |      |   |     |            |   |            |            |            |
| 7 |      |DIA |WP_KILL|     |9  |norm|T55_U19953_M0   |HTTP_NORM|      |   |     |            |001|SM_EXTERN_WS|            |            |

Found 8 active workprocesses


Total number of workprocesses is 16
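Since five of the eight active workprocesses above sit in WP_KILL, a per-state tally of the pipe-delimited rows makes the overload pattern obvious. A minimal sketch (the column position of State is assumed from this trace layout):

```python
# Tally workprocess states from pipe-delimited rows of a table like the
# one above. Sample rows are abbreviated copies of rows from this trace.
rows = [
    "| 0 |      |DIA |WP_KILL|     |9  |norm|T22_U19960_M0  |HTTP_NORM|",
    "| 2 |32121 |DIA |WP_HOLD|RFC  |   |low |T10_U9773_M0   |ASYNC_RFC|",
    "| 3 |19909 |DIA |WP_RUN |     |   |norm|T549_U21021_M0 |HTTP_NORM|",
]

def state_counts(rows):
    """Count rows per workprocess state (WP_KILL, WP_HOLD, WP_RUN, ...)."""
    counts = {}
    for row in rows:
        cols = [c.strip() for c in row.split("|")]
        state = cols[4]  # State column, assumed by position in this layout
        counts[state] = counts.get(state, 0) + 1
    return counts

print(state_counts(rows))
```

A skew toward WP_KILL or WP_HOLD, as in this snapshot, is the kind of signal worth extracting before digging into individual rows.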

Session Table Sun Sep 22 05:12:43 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP|Program    |Prio|Tasks|Application-Info|Tcode|ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|--|-----------|----|-----|----------------|-----|----------|
|HTTP_NORMAL |T0_U20103_M0    |   |            |10.54.36.26         |04:35:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T1_U20146_M0    |   |            |10.54.36.37         |04:37:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T2_U20040_M0    |   |            |10.54.36.37         |04:33:32|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T3_U20015_M0    |   |            |smprd02.niladv.org  |04:32:49|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T4_U20075_M0    |   |            |10.54.36.19         |04:34:59|  |           |norm|6    |                |     |         0|
|HTTP_NORMAL |T5_U20044_M0    |   |            |10.54.36.28         |04:33:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T6_U20079_M0    |   |            |10.54.36.17         |04:35:02|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T7_U20048_M0    |   |            |10.54.36.29         |04:33:48|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T8_U20017_M0    |   |            |10.54.36.27         |04:32:50|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T9_U20138_M0    |   |            |10.54.36.36         |04:37:03|  |           |norm|3    |                |     |         0|
|ASYNC_RFC   |T10_U9773_M0    |001|SM_EFWK     |smprd02.niladv.org  |00:06:16|2 |SAPMSSY1   |low |     |                |     |      4215|
|HTTP_NORMAL |T11_U20406_M0   |   |            |10.54.36.29         |04:47:48|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T12_U20220_M0   |   |            |10.54.36.32         |04:40:50|  |           |norm|13   |                |     |         0|
|HTTP_NORMAL |T13_U20026_M0   |   |            |10.54.36.12         |04:33:01|  |           |norm|6    |                |     |         0|
|HTTP_NORMAL |T14_U20145_M0   |   |            |10.54.36.13         |04:37:32|  |           |norm|5    |                |     |         0|
|HTTP_NORMAL |T15_U20162_M0   |   |            |10.54.36.38         |04:38:02|  |           |norm|7    |                |     |         0|
|HTTP_NORMAL |T16_U20130_M0   |   |            |10.50.47.10         |04:36:54|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T17_U19983_M0   |   |            |10.54.36.13         |04:31:30|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T18_U20020_M0   |   |            |10.50.47.10         |04:32:53|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T19_U20278_M0   |   |            |10.54.36.29         |04:42:48|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T20_U20221_M0   |   |            |10.54.36.27         |04:40:50|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T21_U20194_M0   |   |            |10.54.36.37         |04:39:33|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T22_U19960_M0   |001|SM_EXTERN_WS|10.54.36.27         |04:32:00|0 |SAPMHTTP   |norm|1    |                |     |      4590|
|HTTP_NORMAL |T23_U20175_M0   |   |            |10.54.36.29         |04:38:48|  |           |norm|13   |                |     |         0|
|SYNC_RFC    |T24_U20019_M0   |   |            |smprd02.niladv.org  |04:32:52|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T25_U20202_M0   |   |            |10.54.36.12         |04:40:02|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T26_U20099_M0   |   |            |10.54.36.13         |04:35:31|  |           |norm|6    |                |     |         0|
|HTTP_NORMAL |T27_U20043_M0   |   |            |10.54.36.11         |04:33:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T28_U20082_M0   |   |            |10.54.36.30         |04:35:03|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T29_U20080_M0   |   |            |10.54.36.25         |04:35:02|  |           |norm|3    |                |     |         0|
|ASYNC_RFC   |T30_U25456_M0   |001|EXT_SCHAITAN|                    |04:30:00|4 |SAPMSSY1   |low |2    |                |     |      4237|
|HTTP_NORMAL |T31_U20168_M0   |   |            |10.54.36.35         |04:38:27|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T32_U20104_M0   |   |            |10.54.36.28         |04:35:33|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T33_U20262_M0   |   |            |smprd02.niladv.org  |04:42:01|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T34_U20024_M0   |   |            |10.54.36.19         |04:32:58|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T35_U20203_M0   |   |            |10.54.36.36         |04:40:03|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T36_U20219_M0   |   |            |smprd02.niladv.org  |04:40:48|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T37_U20132_M0   |   |            |10.54.36.34         |04:36:59|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T38_U20216_M0   |   |            |10.54.36.29         |04:40:38|  |           |norm|15   |                |     |         0|
|HTTP_NORMAL |T39_U20016_M0   |   |            |10.54.36.32         |04:32:50|  |           |norm|7    |                |     |         0|
|HTTP_NORMAL |T40_U20136_M0   |   |            |10.54.36.25         |04:37:02|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T41_U20100_M0   |   |            |10.54.36.37         |04:35:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T42_U20205_M0   |   |            |10.50.47.13         |04:40:12|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T43_U20076_M0   |   |            |10.54.36.15         |04:35:00|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T44_U19965_M0   |   |            |10.54.36.33         |04:30:57|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T45_U20180_M0   |   |            |smprd02.niladv.org  |04:38:53|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T46_U20023_M0   |   |            |10.54.36.34         |04:32:58|  |           |norm|3    |                |     |         0|
|ASYNC_RFC   |T47_U9774_M0    |001|SM_EFWK     |smprd02.niladv.org  |00:06:16|0 |SAPMSSY1   |low |     |                |     |      8342|
|HTTP_NORMAL |T48_U20071_M0   |   |            |10.50.47.10         |04:34:54|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T49_U19974_M0   |   |            |10.54.36.36         |04:31:02|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T50_U20135_M0   |   |            |10.54.36.17         |04:37:02|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T51_U20215_M0   |   |            |smprd02.niladv.org  |04:40:37|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T52_U19972_M0   |   |            |10.54.36.25         |04:31:02|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T53_U20177_M0   |   |            |10.54.36.32         |04:38:50|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T54_U19975_M0   |   |            |10.54.36.30         |04:31:02|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T55_U19953_M0   |001|SM_EXTERN_WS|10.54.36.14         |04:31:18|7 |SAPMHTTP   |norm|1    |                |     |      4590|
|HTTP_NORMAL |T56_U20025_M0   |   |            |10.54.36.15         |04:33:00|  |           |norm|5    |                |     |         0|
|ASYNC_RFC   |T57_U19917_M0   |001|BGRFC_SUSR  |smprd02.niladv.org  |04:29:56|7 |SAPMSSY1   |norm|2    |                |     |      4246|
|HTTP_NORMAL |T58_U20151_M0   |   |            |10.54.36.14         |04:37:35|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T59_U19947_M0   |001|SM_EXTERN_WS|10.54.36.13         |04:30:42|1 |SAPMHTTP   |norm|1    |                |     |      4590|
|HTTP_NORMAL |T60_U20028_M0   |   |            |10.54.36.17         |04:33:02|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T61_U20113_M0   |   |            |10.54.36.32         |04:35:50|  |           |norm|14   |                |     |         0|
|HTTP_NORMAL |T62_U20045_M0   |   |            |10.54.36.14         |04:33:35|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T63_U20029_M0   |   |            |10.54.36.25         |04:33:02|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T64_U20148_M0   |   |            |10.54.36.26         |04:37:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T65_U19963_M0   |   |            |10.50.47.10         |04:30:53|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T66_U20030_M0   |   |            |10.54.36.38         |04:33:02|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T67_U19964_M0   |   |            |10.54.36.40         |04:30:56|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T68_U19968_M0   |   |            |10.54.36.15         |04:31:00|  |           |norm|5    |                |     |         0|
|ASYNC_RFC   |T69_U19918_M0   |001|BGRFC_SUSR  |smprd02.niladv.org  |04:29:56|4 |SAPMSSY1   |norm|     |                |     |      4203|
|HTTP_NORMAL |T70_U20109_M0   |   |            |10.54.36.41         |04:35:40|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T71_U20451_M0   |   |            |10.54.36.35         |04:49:29|  |           |norm|3    |                |     |         0|
|ASYNC_RFC   |T72_U18046_M0   |001|SM_EFWK     |smprd02.niladv.org  |23:27:18|5 |SAPMSSY1   |low |     |                |     |      4247|
|HTTP_NORMAL |T73_U20178_M0   |   |            |10.54.36.27         |04:38:50|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T74_U19995_M0   |   |            |smprd02.niladv.org  |04:31:48|  |           |norm|1    |                |     |         0|
|SYNC_RFC    |T75_U20558_M0   |   |            |smprd02.niladv.org  |04:53:49|  |           |norm|1    |                |     |         0|
|HTTP_NORMAL |T76_U20199_M0   |   |            |10.50.47.10         |04:39:53|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T77_U20172_M0   |   |            |10.54.36.41         |04:38:40|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T78_U20239_M0   |   |            |10.54.36.25         |04:41:03|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T79_U20212_M0   |   |            |10.54.36.11         |04:40:32|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T80_U19969_M0   |   |            |10.54.36.12         |04:31:01|  |           |norm|4    |                |     |         0|
|HTTP_NORMAL |T81_U20034_M0   |   |            |10.50.47.13         |04:33:12|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T82_U20013_M0   |   |            |10.54.36.29         |04:32:48|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T83_U20022_M0   |   |            |10.54.36.33         |04:32:58|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T84_U20142_M0   |   |            |10.50.47.13         |04:37:12|  |           |norm|3    |                |     |         0|
|HTTP_NORMAL |T85_U20077_M0   |   |            |10.54.36.12         |04:35:01|  |           |norm|3    |                |     |         0|
|SYNC_RFC    |T86_U20069_M0   |   |            |smprd02.niladv.org  |04:34:48|  |           |norm|1    |                |     |         0|
|SYNC_RFC    |T87_U19196_M0   |001|SMD_RFC     |smprd02.niladv.org  |04:27:05|1 |SAPMSSY1   |norm|1    |                |     |      4233|
|HTTP_NORMAL |T88_U20160_M0   |   |            |10.54.36.19         |04:37:58|  |           |norm|26   |                |     |         0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| |
|norm|1 | | |
0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| |
|norm|1 | | |
0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| |
|norm|1 | | |
0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |
SAPMSSY1 |norm|1 |
| | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| |
|norm|3 | | |
0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| |
|low |1 | | |
0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |
SBAL_DELETE |high|1 |
|SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| |
|norm|5 | | |
0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| |
|norm|13 | | |
0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| |
|norm|3 | | |
0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |
SAPMSSY1 |norm|1 |
| | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| |
|norm|6 | | |
0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| |
|norm|3 | | |
0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| |
|norm|1 | | |
0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| |
|norm|3 | | |
0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| |
|norm|4 | | |
0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| |
|norm|3 | | |
0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| |
|norm|1 | | |
0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| |
|norm|3 | | |
0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| |
|norm|1 | | |
0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| |
|norm|3 | | |
0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| |
|norm|7 | | |
0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| |
|norm|3 | | |
0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| |
|norm|1 | | |
0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| |
|norm|4 | | |
0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| |
|norm|4 | | |
0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| |
|norm|3 | | |
0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| |
|norm|3 | | |
0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| |
|norm|1 | | |
0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| |
|norm|19 | | |
0|
|HTTP_NORMAL |T191_U20619_M0 | | |10.54.36.29 |04:56:48| |
|norm|3 | | |
0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| |
|norm|1 | | |
0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| |
|norm|4 | | |
0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| |
|norm|14 | | |
0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| |
|norm|19 | | |
0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| |
|norm|4 | | |
0|
|HTTP_NORMAL |T208_U20991_M0 | | |10.54.36.11 |05:09:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| |
|norm|6 | | |
0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| |
|norm|3 | | |
0|
|HTTP_NORMAL |T213_U20687_M0 | | |10.54.36.11 |04:59:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T214_U20638_M0 | | |10.54.36.29 |04:57:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| |
|norm|3 | | |
0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| |
|norm|3 | | |
0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| |
|norm|1 | | |
0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| |
|norm|4 | | |
0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| |
|norm|6 | | |
0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| |
|norm|4 | | |
0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| |
|norm|6 | | |
0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| |
|norm|3 | | |
0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| |
|norm|5 | | |
0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| |
|norm|4 | | |
0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| |
|norm|18 | | |
0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| |
|norm|13 | | |
0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| |
|norm|18 | | |
0|
|SYNC_RFC |T256_U20748_M0 | | |smprd02.niladv.org |05:01:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| |
|norm|18 | | |
0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T267_U20801_M0 | | |10.54.36.35 |05:03:28| |
|norm|2 | | |
0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T270_U20802_M0 | | |10.54.36.13 |05:03:30| |
|norm|3 | | |
0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| |
|norm|3 | | |
0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| |
|norm|1 | | |
0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| |
|norm|3 | | |
0|
|SYNC_RFC |T277_U20600_M0 | | |smprd02.niladv.org |04:55:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| |
|norm|3 | | |
0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| |
|norm|6 | | |
0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T287_U20582_M0 | | |10.54.36.12 |04:55:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T288_U20578_M0 | | |10.54.36.29 |04:54:48| |
|norm|11 | | |
0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| |
|norm|3 | | |
0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| |
|norm|4 | | |
0|
|SYNC_RFC |T293_U20809_M0 | | |smprd02.niladv.org |05:03:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T296_U20636_M0 | | |10.54.36.14 |04:57:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| |
|norm|5 | | |
0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T300_U20907_M0 | | |10.50.47.10 |05:06:54| |
|norm|2 | | |
0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T305_U20580_M0 | | |10.50.47.10 |04:54:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T306_U20581_M0 | | |10.54.36.34 |04:54:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T307_U20585_M0 | | |10.54.36.17 |04:55:05| |
|norm|3 | | |
0|
|ASYNC_RFC |T308_U20586_M0 | | |10.54.36.10 |04:55:10| |
|norm|1 | | |
0|
|HTTP_NORMAL |T309_U20587_M0 | | |10.50.47.13 |04:55:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T310_U20592_M0 | | |10.54.36.13 |04:55:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T311_U20594_M0 | | |10.54.36.28 |04:55:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T312_U20595_M0 | | |10.54.36.26 |04:55:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T313_U20596_M0 | | |10.54.36.29 |04:55:38| |
|norm|13 | | |
0|
|SYNC_RFC |T314_U20597_M0 | | |smprd02.niladv.org |04:55:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T315_U20602_M0 | | |10.54.36.19 |04:55:58| |
|norm|17 | | |
0|
|HTTP_NORMAL |T316_U20603_M0 | | |10.54.36.15 |04:56:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T317_U20605_M0 | | |10.54.36.30 |04:56:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T318_U20606_M0 | | |10.54.36.38 |04:56:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T319_U20613_M0 | | |10.54.36.11 |04:56:32| |
|norm|3 | | |
0|
|SYNC_RFC |T320_U20615_M0 | | |smprd02.niladv.org |04:56:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T321_U20616_M0 | | |10.54.36.41 |04:56:40| |
|norm|3 | | |
0|
|SYNC_RFC |T322_U20620_M0 | | |smprd02.niladv.org |04:56:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T323_U20621_M0 | | |10.54.36.32 |04:56:50| |
|norm|10 | | |
0|
|HTTP_NORMAL |T324_U20622_M0 | | |10.54.36.27 |04:56:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T325_U20624_M0 | | |10.54.36.33 |04:56:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T326_U20626_M0 | | |10.54.36.36 |04:57:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T327_U20627_M0 | | |10.54.36.25 |04:57:03| |
|norm|3 | | |
0|
|SYNC_RFC |T328_U20630_M0 | | |smprd02.niladv.org |04:57:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T329_U20632_M0 | | |10.50.47.13 |04:57:13| |
|norm|4 | | |
0|
|HTTP_NORMAL |T330_U20639_M0 | | |10.54.36.29 |04:57:48| |
|norm|3 | | |
0|
|SYNC_RFC |T331_U20641_M0 | | |smprd02.niladv.org |04:57:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T332_U20642_M0 | | |10.50.47.10 |04:57:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T333_U20643_M0 | | |10.54.36.34 |04:57:58| |
|norm|3 | | |
0|
|SYNC_RFC |T334_U20692_M0 | | |smprd02.niladv.org |04:59:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T335_U20645_M0 | | |10.54.36.12 |04:58:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T336_U20648_M0 | | |10.54.36.17 |04:58:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T337_U20653_M0 | | |10.54.36.35 |04:58:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T338_U20717_M0 | | |10.54.36.32 |05:00:50| |
|norm|3 | | |
0|
|SYNC_RFC |T339_U20673_M0 | | |smprd02.niladv.org |04:58:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T340_U20677_M0 | | |10.54.36.38 |04:59:02| |
|norm|4 | | |
0|
|SYNC_RFC |T341_U20690_M0 | | |smprd02.niladv.org |04:59:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T342_U20674_M0 | | |10.54.36.19 |04:58:58| |
|norm|10 | | |
0|
|HTTP_NORMAL |T343_U20691_M0 | | |10.54.36.29 |04:59:49| |
|norm|9 | | |
0|
|HTTP_NORMAL |T344_U20676_M0 | | |10.54.36.15 |04:59:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T345_U20679_M0 | | |10.54.36.30 |04:59:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T346_U20990_M0 | | |10.54.36.13 |05:09:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T347_U20664_M0 | | |10.54.36.26 |04:58:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T348_U20723_M0 | | |10.54.36.12 |05:01:01| |
|norm|4 | | |
0|
|SYNC_RFC |T349_U20669_M0 | | |smprd02.niladv.org |04:58:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T350_U20670_M0 | | |10.54.36.32 |04:58:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T351_U20671_M0 | | |10.54.36.27 |04:58:50| |
|norm|4 | | |
0|
|SYNC_RFC |T352_U20716_M0 | | |smprd02.niladv.org |05:00:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T353_U20694_M0 | | |10.54.36.33 |04:59:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T354_U20697_M0 | | |10.54.36.25 |05:00:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T355_U20698_M0 | | |10.54.36.36 |05:00:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T356_U20700_M0 | | |10.50.47.13 |05:00:12| |
|norm|3 | | |
0|
|SYNC_RFC |T357_U20702_M0 | | |smprd02.niladv.org |05:00:14| |
|norm|1 | | |
0|
|SYNC_RFC |T358_U20703_M0 | | |smprd02.niladv.org |05:00:16| |
|norm|1 | | |
0|
|SYNC_RFC |T359_U20704_M0 | | |smprd02.niladv.org |05:00:18| |
|norm|1 | | |
0|
|SYNC_RFC |T360_U20705_M0 | | |smprd02.niladv.org |05:00:20| |
|norm|1 | | |
0|
|SYNC_RFC |T361_U20707_M0 | | |smprd02.niladv.org |05:00:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T362_U20711_M0 | | |10.54.36.37 |05:00:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T363_U20712_M0 | | |10.54.36.14 |05:00:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T364_U20713_M0 | | |10.54.36.29 |05:00:38| |
|norm|12 | | |
0|
|SYNC_RFC |T365_U20714_M0 | | |smprd02.niladv.org |05:00:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T366_U20718_M0 | | |10.54.36.27 |05:00:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T367_U20720_M0 | | |10.50.47.10 |05:00:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T368_U20721_M0 | | |10.54.36.40 |05:00:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T369_U20722_M0 | | |10.54.36.34 |05:00:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T370_U20724_M0 | | |10.54.36.15 |05:01:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T371_U20725_M0 | | |10.54.36.17 |05:01:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T372_U20727_M0 | | |10.54.36.38 |05:01:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T373_U20738_M0 | | |10.54.36.35 |05:01:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T374_U20729_M0 | | |10.54.36.30 |05:01:04| |
|norm|3 | | |
0|
|SYNC_RFC |T375_U20731_M0 | | |smprd02.niladv.org |05:01:14| |
|norm|1 | | |
0|
|SYNC_RFC |T376_U20732_M0 | | |smprd02.niladv.org |05:01:16| |
|norm|1 | | |
0|
|SYNC_RFC |T377_U20733_M0 | | |smprd02.niladv.org |05:01:18| |
|norm|1 | | |
0|
|SYNC_RFC |T378_U20734_M0 | | |smprd02.niladv.org |05:01:20| |
|norm|1 | | |
0|
|SYNC_RFC |T379_U20736_M0 | | |smprd02.niladv.org |05:01:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T380_U20739_M0 | | |10.54.36.13 |05:01:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T381_U20740_M0 | | |10.54.36.26 |05:01:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T382_U20741_M0 | | |10.54.36.28 |05:01:32| |
|norm|6 | | |
0|
|HTTP_NORMAL |T383_U20742_M0 | | |10.54.36.11 |05:01:32| |
|norm|3 | | |
0|
|SYNC_RFC |T384_U20876_M0 | | |smprd02.niladv.org |05:05:50| |
|norm|1 | | |
0|
|SYNC_RFC |T385_U20745_M0 | | |smprd02.niladv.org |05:01:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T386_U20750_M0 | | |10.54.36.19 |05:01:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T387_U20751_M0 | | |10.54.36.33 |05:01:58| |
|norm|3 | | |
0|
|SYNC_RFC |T388_U20752_M0 | | |smprd02.niladv.org |05:02:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T389_U20754_M0 | | |10.54.36.25 |05:02:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T390_U20755_M0 | | |10.54.36.36 |05:02:04| |
|norm|3 | | |
0|
|SYNC_RFC |T391_U20758_M0 | | |smprd02.niladv.org |05:02:11| |
|norm|1 | | |
0|
|SYNC_RFC |T392_U20760_M0 | | |smprd02.niladv.org |05:02:14| |
|norm|1 | | |
0|
|SYNC_RFC |T393_U20761_M0 | | |smprd02.niladv.org |05:02:16| |
|norm|1 | | |
0|
|SYNC_RFC |T394_U20762_M0 | | |smprd02.niladv.org |05:02:18| |
|norm|1 | | |
0|
|SYNC_RFC |T395_U20763_M0 | | |smprd02.niladv.org |05:02:20| |
|norm|1 | | |
0|
|SYNC_RFC |T396_U20764_M0 | | |smprd02.niladv.org |05:02:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T397_U20769_M0 | | |10.54.36.37 |05:02:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T398_U20877_M0 | | |10.54.36.27 |05:05:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T399_U20774_M0 | | |10.54.36.29 |05:02:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T400_U20775_M0 | | |10.54.36.29 |05:02:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T401_U20776_M0 | | |10.54.36.29 |05:02:48| |
|norm|2 | | |
0|
|SYNC_RFC |T402_U20777_M0 | | |smprd02.niladv.org |05:02:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T403_U20778_M0 | | |10.54.36.27 |05:02:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T404_U20779_M0 | | |10.54.36.32 |05:02:50| |
|norm|11 | | |
0|
|HTTP_NORMAL |T405_U20781_M0 | | |10.50.47.10 |05:02:53| |
|norm|2 | | |
0|
|SYNC_RFC |T406_U20782_M0 | | |smprd02.niladv.org |05:02:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T407_U20783_M0 | | |10.54.36.34 |05:02:58| |
|norm|2 | | |
0|
|SYNC_RFC |T408_U20784_M0 | | |smprd02.niladv.org |05:03:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T409_U20785_M0 | | |10.54.36.12 |05:03:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T410_U20786_M0 | | |10.54.36.17 |05:03:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T411_U20788_M0 | | |10.54.36.15 |05:03:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T412_U20789_M0 | | |10.54.36.38 |05:03:04| |
|norm|5 | | |
0|
|HTTP_NORMAL |T413_U20790_M0 | | |10.54.36.30 |05:03:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T414_U20793_M0 | | |10.50.47.13 |05:03:11| |
|norm|4 | | |
0|
|SYNC_RFC |T415_U20795_M0 | | |smprd02.niladv.org |05:03:14| |
|norm|1 | | |
0|
|SYNC_RFC |T416_U20796_M0 | | |smprd02.niladv.org |05:03:16| |
|norm|1 | | |
0|
|SYNC_RFC |T417_U20797_M0 | | |smprd02.niladv.org |05:03:18| |
|norm|1 | | |
0|
|SYNC_RFC |T418_U20798_M0 | | |smprd02.niladv.org |05:03:20| |
|norm|1 | | |
0|
|SYNC_RFC |T419_U20799_M0 | | |smprd02.niladv.org |05:03:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T420_U20803_M0 | | |10.54.36.26 |05:03:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T421_U20804_M0 | | |10.54.36.28 |05:03:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T422_U20805_M0 | | |10.54.36.11 |05:03:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T423_U20807_M0 | | |10.54.36.41 |05:03:39| |
|norm|2 | | |
0|
|SYNC_RFC |T424_U20810_M0 | | |smprd02.niladv.org |05:03:50| |
|norm|1 | | |
0|
|SYNC_RFC |T425_U20812_M0 | | |smprd02.niladv.org |05:03:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T426_U20813_M0 | | |10.54.36.19 |05:03:58| |
|norm|9 | | |
0|
|HTTP_NORMAL |T427_U20814_M0 | | |10.54.36.33 |05:03:58| |
|norm|2 | | |
0|
|SYNC_RFC |T428_U20844_M0 | | |smprd02.niladv.org |05:04:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T429_U20817_M0 | | |10.54.36.25 |05:04:03| |
|norm|3 | | |
0|
|SYNC_RFC |T430_U20818_M0 | | |smprd02.niladv.org |05:04:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T431_U20819_M0 | | |10.54.36.36 |05:04:04| |
|norm|3 | | |
0|
|SYNC_RFC |T432_U20821_M0 | | |smprd02.niladv.org |05:04:10| |
|norm|1 | | |
0|
|SYNC_RFC |T433_U20823_M0 | | |smprd02.niladv.org |05:04:14| |
|norm|1 | | |
0|
|SYNC_RFC |T434_U20824_M0 | | |smprd02.niladv.org |05:04:16| |
|norm|1 | | |
0|
|SYNC_RFC |T435_U20825_M0 | | |smprd02.niladv.org |05:04:18| |
|norm|1 | | |
0|
|SYNC_RFC |T436_U20827_M0 | | |smprd02.niladv.org |05:04:20| |
|norm|1 | | |
0|
|SYNC_RFC |T437_U20828_M0 | | |smprd02.niladv.org |05:04:22| |
|norm|1 | | |
0|
|SYNC_RFC |T438_U20961_M0 | | |smprd02.niladv.org |05:08:46| |
|norm|1 | | |
0|
|SYNC_RFC |T439_U20856_M0 | | |smprd02.niladv.org |05:05:10| |
|norm|1 | | |
0|
|SYNC_RFC |T440_U20854_M0 | | |smprd02.niladv.org |05:05:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T441_U20847_M0 | | |10.50.47.10 |05:04:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T442_U20851_M0 | | |10.54.36.17 |05:05:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T443_U20850_M0 | | |10.54.36.12 |05:05:01| |
|norm|2 | | |
0|
|HTTP_NORMAL |T444_U20853_M0 | | |10.54.36.15 |05:05:03| |
|norm|4 | | |
0|
|SYNC_RFC |T445_U20936_M0 | | |smprd02.niladv.org |05:07:45| |
|norm|1 | | |
0|
|HTTP_NORMAL |T446_U20840_M0 | | |10.54.36.37 |05:04:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T447_U20910_M0 | | |10.54.36.12 |05:07:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T448_U20845_M0 | | |10.54.36.29 |05:04:49| |
|norm|2 | | |
0|
|SYNC_RFC |T449_U20859_M0 | | |smprd02.niladv.org |05:05:14| |
|norm|1 | | |
0|
|SYNC_RFC |T450_U20860_M0 | | |smprd02.niladv.org |05:05:16| |
|norm|1 | | |
0|
|SYNC_RFC |T451_U20861_M0 | | |smprd02.niladv.org |05:05:18| ||norm|1 | | |0|
|SYNC_RFC |T452_U20863_M0 | | |smprd02.niladv.org |05:05:20| ||norm|1 | | |0|
|SYNC_RFC |T453_U20864_M0 | | |smprd02.niladv.org |05:05:22| ||norm|1 | | |0|
|HTTP_NORMAL |T454_U20867_M0 | | |10.54.36.35 |05:05:28| ||norm|2 | | |0|
|HTTP_NORMAL |T455_U20868_M0 | | |10.54.36.13 |05:05:31| ||norm|5 | | |0|
|HTTP_NORMAL |T456_U20869_M0 | | |10.54.36.26 |05:05:32| ||norm|2 | | |0|
|HTTP_NORMAL |T457_U20870_M0 | | |10.54.36.28 |05:05:33| ||norm|2 | | |0|
|HTTP_NORMAL |T458_U20872_M0 | | |10.54.36.29 |05:05:38| ||norm|10 | | |0|
|HTTP_NORMAL |T459_U20873_M0 | | |10.54.36.41 |05:05:40| ||norm|2 | | |0|
|SYNC_RFC |T460_U20874_M0 | | |smprd02.niladv.org |05:05:40| ||norm|1 | | |0|
|HTTP_NORMAL |T461_U20878_M0 | | |10.54.36.32 |05:05:51| ||norm|17 | | |0|
|HTTP_NORMAL |T462_U20880_M0 | | |10.54.36.19 |05:05:58| ||norm|2 | | |0|
|HTTP_NORMAL |T463_U20881_M0 | | |10.54.36.33 |05:05:59| ||norm|2 | | |0|
|HTTP_NORMAL |T464_U20911_M0 | | |10.54.36.17 |05:07:02| ||norm|2 | | |0|
|HTTP_NORMAL |T465_U20884_M0 | | |10.54.36.30 |05:06:03| ||norm|3 | | |0|
|HTTP_NORMAL |T466_U20886_M0 | | |10.54.36.38 |05:06:03| ||norm|5 | | |0|
|SYNC_RFC |T467_U20887_M0 | | |smprd02.niladv.org |05:06:05| ||norm|1 | | |0|
|SYNC_RFC |T468_U20888_M0 | | |smprd02.niladv.org |05:06:05| ||norm|1 | | |0|
|SYNC_RFC |T469_U20890_M0 | | |smprd02.niladv.org |05:06:14| ||norm|1 | | |0|
|SYNC_RFC |T470_U20891_M0 | | |smprd02.niladv.org |05:06:15| ||norm|1 | | |0|
|SYNC_RFC |T471_U20892_M0 | | |smprd02.niladv.org |05:06:17| ||norm|1 | | |0|
|SYNC_RFC |T472_U20893_M0 | | |smprd02.niladv.org |05:06:18| ||norm|1 | | |0|
|SYNC_RFC |T473_U20896_M0 | | |smprd02.niladv.org |05:06:20| ||norm|1 | | |0|
|SYNC_RFC |T474_U20897_M0 | | |smprd02.niladv.org |05:06:22| ||norm|1 | | |0|
|HTTP_NORMAL |T475_U20899_M0 | | |10.54.36.11 |05:06:32| ||norm|3 | | |0|
|HTTP_NORMAL |T476_U20901_M0 | | |10.54.36.37 |05:06:33| ||norm|2 | | |0|
|HTTP_NORMAL |T477_U20902_M0 | | |10.54.36.14 |05:06:37| ||norm|2 | | |0|
|SYNC_RFC |T478_U20903_M0 | | |smprd02.niladv.org |05:06:40| ||norm|1 | | |0|
|SYNC_RFC |T479_U20908_M0 | | |smprd02.niladv.org |05:06:58| ||norm|1 | | |0|
|HTTP_NORMAL |T480_U20909_M0 | | |10.54.36.34 |05:06:59| ||norm|2 | | |0|
|HTTP_NORMAL |T481_U20912_M0 | | |10.54.36.25 |05:07:03| ||norm|2 | | |0|
|HTTP_NORMAL |T482_U20914_M0 | | |10.54.36.36 |05:07:04| ||norm|3 | | |0|
|HTTP_NORMAL |T483_U20929_M0 | | |10.54.36.35 |05:07:29| ||norm|2 | | |0|
|SYNC_RFC |T484_U20916_M0 | | |smprd02.niladv.org |05:07:05| ||norm|1 | | |0|
|SYNC_RFC |T485_U20917_M0 | | |smprd02.niladv.org |05:07:05| ||norm|1 | | |0|
|SYNC_RFC |T486_U20918_M0 | | |smprd02.niladv.org |05:07:11| ||norm|1 | | |0|
|HTTP_NORMAL |T487_U20919_M0 | | |10.50.47.13 |05:07:12| ||norm|2 | | |0|
|SYNC_RFC |T488_U20921_M0 | | |smprd02.niladv.org |05:07:15| ||norm|1 | | |0|
|SYNC_RFC |T489_U20922_M0 | | |smprd02.niladv.org |05:07:15| ||norm|1 | | |0|
|SYNC_RFC |T490_U20923_M0 | | |smprd02.niladv.org |05:07:17| ||norm|1 | | |0|
|SYNC_RFC |T491_U20924_M0 | | |smprd02.niladv.org |05:07:19| ||norm|1 | | |0|
|SYNC_RFC |T492_U20926_M0 | | |smprd02.niladv.org |05:07:21| ||norm|1 | | |0|
|SYNC_RFC |T493_U20927_M0 | | |smprd02.niladv.org |05:07:23| ||norm|1 | | |0|
|HTTP_NORMAL |T494_U20930_M0 | | |10.54.36.13 |05:07:31| ||norm|3 | | |0|
|HTTP_NORMAL |T495_U20931_M0 | | |10.54.36.26 |05:07:32| ||norm|2 | | |0|
|HTTP_NORMAL |T496_U20933_M0 | | |10.54.36.28 |05:07:34| ||norm|3 | | |0|
|HTTP_NORMAL |T497_U20937_M0 | | |10.54.36.29 |05:07:48| ||norm|9 | | |0|
|HTTP_NORMAL |T498_U20938_M0 | | |10.54.36.29 |05:07:48| ||norm|2 | | |0|
|HTTP_NORMAL |T499_U20939_M0 | | |10.54.36.29 |05:07:49| ||norm|2 | | |0|
|SYNC_RFC |T500_U20941_M0 | | |smprd02.niladv.org |05:07:59| ||norm|1 | | |0|
|HTTP_NORMAL |T501_U20942_M0 | | |10.54.36.15 |05:08:01| ||norm|4 | | |0|
|SYNC_RFC |T502_U20946_M0 | | |smprd02.niladv.org |05:08:06| ||norm|1 | | |0|
|SYNC_RFC |T503_U20947_M0 | | |smprd02.niladv.org |05:08:06| ||norm|1 | | |0|
|SYNC_RFC |T504_U20949_M0 | | |smprd02.niladv.org |05:08:15| ||norm|1 | | |0|
|SYNC_RFC |T505_U20950_M0 | | |smprd02.niladv.org |05:08:17| ||norm|1 | | |0|
|SYNC_RFC |T506_U20951_M0 | | |smprd02.niladv.org |05:08:19| ||norm|1 | | |0|
|SYNC_RFC |T507_U20952_M0 | | |smprd02.niladv.org |05:08:21| ||norm|1 | | |0|
|SYNC_RFC |T508_U20953_M0 | | |smprd02.niladv.org |05:08:23| ||norm|1 | | |0|
|HTTP_NORMAL |T509_U20958_M0 | | |10.54.36.41 |05:08:40| ||norm|2 | | |0|
|HTTP_NORMAL |T510_U20962_M0 | | |10.54.36.32 |05:08:50| ||norm|2 | | |0|
|HTTP_NORMAL |T511_U20963_M0 | | |10.54.36.27 |05:08:50| ||norm|2 | | |0|
|HTTP_NORMAL |T512_U20965_M0 | | |10.54.36.33 |05:08:57| ||norm|2 | | |0|
|HTTP_NORMAL |T513_U20966_M0 | | |10.54.36.19 |05:08:58| ||norm|19 | | |0|
|SYNC_RFC |T514_U20967_M0 | | |smprd02.niladv.org |05:09:00| ||norm|1 | | |0|
|HTTP_NORMAL |T515_U20968_M0 | | |10.54.36.12 |05:09:02| ||norm|5 | | |0|
|HTTP_NORMAL |T516_U20969_M0 | | |10.54.36.30 |05:09:03| ||norm|2 | | |0|
|HTTP_NORMAL |T517_U20971_M0 | | |10.54.36.38 |05:09:03| ||norm|3 | | |0|
|HTTP_NORMAL |T518_U20974_M0 | | |10.54.36.17 |05:09:05| ||norm|2 | | |0|
|SYNC_RFC |T519_U20975_M0 | | |smprd02.niladv.org |05:09:06| ||norm|1 | | |0|
|SYNC_RFC |T520_U20976_M0 | | |smprd02.niladv.org |05:09:06| ||norm|1 | | |0|
|SYNC_RFC |T521_U20977_M0 | | |smprd02.niladv.org |05:09:08| ||norm|1 | | |0|
|SYNC_RFC |T522_U20997_M0 | | |smprd02.niladv.org |05:09:46| ||norm|1 | | |0|
|HTTP_NORMAL |T523_U20998_M0 | | |10.54.36.29 |05:09:48| ||norm|5 | | |0|
|HTTP_NORMAL |T524_U21071_M0 | | |10.50.47.10 |05:11:53| ||norm|1 | | |0|
|SYNC_RFC |T525_U21069_M0 | | |smprd02.niladv.org |05:11:48| ||norm|1 | | |0|
|HTTP_NORMAL |T526_U20982_M0 | | |10.50.47.13 |05:09:12| ||norm|3 | | |0|
|SYNC_RFC |T527_U20984_M0 | | |smprd02.niladv.org |05:09:15| ||norm|1 | | |0|
|SYNC_RFC |T528_U20985_M0 | | |smprd02.niladv.org |05:09:17| ||norm|1 | | |0|
|SYNC_RFC |T529_U20986_M0 | | |smprd02.niladv.org |05:09:19| ||norm|1 | | |0|
|SYNC_RFC |T530_U20987_M0 | | |smprd02.niladv.org |05:09:21| ||norm|1 | | |0|
|SYNC_RFC |T531_U20988_M0 | | |smprd02.niladv.org |05:09:23| ||norm|1 | | |0|
|HTTP_NORMAL |T532_U20992_M0 | | |10.54.36.37 |05:09:33| ||norm|2 | | |0|
|HTTP_NORMAL |T533_U20994_M0 | | |10.54.36.26 |05:09:33| ||norm|2 | | |0|
|INTERNAL |T534_U21091_M0 |000|SAPSYS | |05:12:34|4 ||high| | | |4200|
|HTTP_NORMAL |T535_U21000_M0 | | |10.50.47.10 |05:09:53| ||norm|3 | | |0|
|HTTP_NORMAL |T536_U21001_M0 | | |10.54.36.34 |05:09:58| ||norm|2 | | |0|
|HTTP_NORMAL |T537_U21002_M0 | | |10.54.36.25 |05:10:03| ||norm|2 | | |0|
|HTTP_NORMAL |T538_U21004_M0 | | |10.54.36.36 |05:10:04| ||norm|2 | | |0|
|SYNC_RFC |T539_U21007_M0 | | |smprd02.niladv.org |05:10:06| ||norm|1 | | |0|
|SYNC_RFC |T540_U21008_M0 | | |smprd02.niladv.org |05:10:08| ||norm|1 | | |0|
|SYNC_RFC |T541_U21009_M0 | | |smprd02.niladv.org |05:10:08| ||norm|1 | | |0|
|SYNC_RFC |T542_U21010_M0 | | |smprd02.niladv.org |05:10:09| ||norm|1 | | |0|
|SYNC_RFC |T543_U21012_M0 | | |smprd02.niladv.org |05:10:15| ||norm|1 | | |0|
|SYNC_RFC |T544_U21013_M0 | | |smprd02.niladv.org |05:10:17| ||norm|1 | | |0|
|SYNC_RFC |T545_U21014_M0 | | |smprd02.niladv.org |05:10:19| ||norm|1 | | |0|
|SYNC_RFC |T546_U21015_M0 | | |smprd02.niladv.org |05:10:21| ||norm|1 | | |0|
|SYNC_RFC |T547_U21016_M0 | | |smprd02.niladv.org |05:10:23| ||norm|1 | | |0|
|HTTP_NORMAL |T548_U21020_M0 | | |10.54.36.35 |05:10:27| ||norm|2 | | |0|
|HTTP_NORMAL |T549_U21021_M0 | | |10.54.36.28 |05:12:40|3 ||norm|1 | | |0|
|SYNC_RFC |T550_U21049_M0 | | |smprd02.niladv.org |05:11:08| ||norm|1 | | |0|
|HTTP_NORMAL |T551_U21047_M0 | | |10.54.36.38 |05:11:04| ||norm|1 | | |0|
|HTTP_NORMAL |T552_U21048_M0 | | |10.54.36.17 |05:11:05| ||norm|1 | | |0|
|HTTP_NORMAL |T553_U21041_M0 | | |10.54.36.15 |05:11:00| ||norm|1 | | |0|
|HTTP_NORMAL |T554_U21040_M0 | | |10.54.36.33 |05:10:58| ||norm|1 | | |0|
|HTTP_NORMAL |T555_U21042_M0 | | |10.54.36.12 |05:11:02| ||norm|1 | | |0|
|HTTP_NORMAL |T556_U21043_M0 | | |10.54.36.30 |05:11:03| ||norm|1 | | |0|
|HTTP_NORMAL |T557_U21039_M0 | | |10.54.36.40 |05:10:56| ||norm|1 | | |0|
|HTTP_NORMAL |T558_U21031_M0 | | |10.54.36.29 |05:10:38| ||norm|9 | | |0|
|SYNC_RFC |T559_U21035_M0 | | |smprd02.niladv.org |05:10:50| ||norm|1 | | |0|
|HTTP_NORMAL |T560_U21036_M0 | | |10.54.36.32 |05:10:50| ||norm|8 | | |0|
|HTTP_NORMAL |T561_U21037_M0 | | |10.54.36.27 |05:10:50| ||norm|2 | | |0|
|SYNC_RFC |T562_U21050_M0 | | |smprd02.niladv.org |05:11:09| ||norm|1 | | |0|
|HTTP_NORMAL |T563_U21051_M0 | | |10.50.47.13 |05:11:12| ||norm|1 | | |0|
|SYNC_RFC |T564_U21053_M0 | | |smprd02.niladv.org |05:11:15| ||norm|1 | | |0|
|SYNC_RFC |T565_U21054_M0 | | |smprd02.niladv.org |05:11:17| ||norm|1 | | |0|
|SYNC_RFC |T566_U21055_M0 | | |smprd02.niladv.org |05:11:19| ||norm|1 | | |0|
|SYNC_RFC |T567_U21056_M0 | | |smprd02.niladv.org |05:11:21| ||norm|1 | | |0|
|SYNC_RFC |T568_U21057_M0 | | |smprd02.niladv.org |05:11:23| ||norm|1 | | |0|
|HTTP_NORMAL |T569_U21061_M0 | | |10.54.36.13 |05:11:32| ||norm|1 | | |0|
|HTTP_NORMAL |T570_U21062_M0 | | |10.54.36.11 |05:11:32| ||norm|1 | | |0|
|HTTP_NORMAL |T571_U21063_M0 | | |10.54.36.37 |05:11:33| ||norm|1 | | |0|
|HTTP_NORMAL |T572_U21065_M0 | | |10.54.36.26 |05:11:33| ||norm|1 | | |0|
|HTTP_NORMAL |T573_U21066_M0 | | |10.54.36.14 |05:11:37| ||norm|1 | | |0|
|HTTP_NORMAL |T574_U21067_M0 | | |10.54.36.41 |05:11:39| ||norm|1 | | |0|
|HTTP_NORMAL |T575_U21072_M0 | | |10.54.36.19 |05:11:57| ||norm|1 | | |0|
|HTTP_NORMAL |T576_U21073_M0 | | |10.54.36.34 |05:11:58| ||norm|1 | | |0|
|SYNC_RFC |T578_U21075_M0 | | |smprd02.niladv.org |05:12:01| ||norm|1 | | |0|
|HTTP_NORMAL |T579_U21076_M0 | | |10.54.36.25 |05:12:03| ||norm|1 | | |0|
|HTTP_NORMAL |T580_U21079_M0 | | |10.54.36.36 |05:12:05| ||norm|1 | | |0|
|SYNC_RFC |T581_U21080_M0 | | |smprd02.niladv.org |05:12:08| ||norm|1 | | |0|
|SYNC_RFC |T582_U21081_M0 | | |smprd02.niladv.org |05:12:10| ||norm|1 | | |0|
|SYNC_RFC |T583_U21082_M0 | | |smprd02.niladv.org |05:12:11| ||norm|1 | | |0|
|SYNC_RFC |T584_U21084_M0 | | |smprd02.niladv.org |05:12:15| ||norm|1 | | |0|
|SYNC_RFC |T585_U21085_M0 | | |smprd02.niladv.org |05:12:17| ||norm|1 | | |0|
|SYNC_RFC |T586_U21086_M0 | | |smprd02.niladv.org |05:12:19| ||norm|1 | | |0|
|SYNC_RFC |T587_U21088_M0 | | |smprd02.niladv.org |05:12:21| ||norm|1 | | |0|
|SYNC_RFC |T588_U21089_M0 | | |smprd02.niladv.org |05:12:23| ||norm|1 | | |0|
|HTTP_NORMAL |T589_U21092_M0 | | |10.54.36.35 |05:12:28| ||norm|1 | | |0|
|HTTP_NORMAL |T590_U21094_M0 | | |10.54.36.28 |05:12:34| ||norm|1 | | |0|

Found 590 logons with 590 sessions


Total ES (gross) memory of all sessions: 79 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T12_U20220_M0


Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 05:12:43:873 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T253_U20470_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0
Force ABAP stack dump of session T288_U20578_M0
Force ABAP stack dump of session T313_U20596_M0
Force ABAP stack dump of session T315_U20602_M0
Force ABAP stack dump of session T323_U20621_M0
Force ABAP stack dump of session T342_U20674_M0
Force ABAP stack dump of session T364_U20713_M0
Force ABAP stack dump of session T404_U20779_M0
Force ABAP stack dump of session T458_U20872_M0
Force ABAP stack dump of session T461_U20878_M0
Force ABAP stack dump of session T513_U20966_M0

RFC-Connection Table (187 entries) Sun Sep 22 05:12:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 4|53822060|53822060SU20926_M0 |T492_U20926_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 5|53882124|53882124SU20949_M0 |T504_U20949_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 11|53813726|53813726SU20922_M0 |T489_U20922_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 12|53370125|53370125SU20716_M0 |T352_U20716_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 13|53891128|53891128SU20952_M0 |T507_U20952_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 16|53750985|53750985SU20896_M0 |T473_U20896_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 20|53819053|53819053SU20924_M0 |T491_U20924_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 21|53677905|53677905SU20861_M0 |T451_U20861_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 23|53812054|53812054SU20921_M0 |T488_U20921_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 25|53941255|53941255SU20976_M0 |T520_U20976_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 26|54305161|54305161SU20692_M0 |T334_U20692_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 28|53871130|53871130SU20946_M0 |T502_U20946_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 31|53533790|53533790SU20795_M0 |T415_U20795_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 33|54303059|54303059SU20690_M0 |T341_U20690_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 34|53477705|53477705SU20764_M0 |T396_U20764_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 35|53807465|53807465SU20918_M0 |T486_U20918_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 37|53606863|53606863SU20824_M0 |T434_U20824_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 38|53461465|53461465SU20758_M0 |T391_U20758_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 42|53536792|53536792SU20796_M0 |T416_U20796_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 43|53659094|53659094SU20854_M0 |T440_U20854_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 44|54184460|54184460SU20641_M0 |T331_U20641_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 45|53745004|53745004SU20892_M0 |T471_U20892_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 48|54172368|54172368SU21089_M0 |T588_U21089_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 50|53747978|53747978SU20893_M0 |T472_U20893_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 53|53816065|53816065SU20923_M0 |T490_U20923_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 56|53666302|53666302SU20856_M0 |T439_U20856_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 60|53799525|53799525SU20916_M0 |T484_U20916_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 67|54240969|54240969SU20669_M0 |T349_U20669_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 68|53474712|53474712SU20763_M0 |T395_U20763_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 69|53609831|53609831SU20825_M0 |T435_U20825_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 71|53603861|53603861SU20823_M0 |T433_U20823_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 72|53504970|53504970SU20777_M0 |T402_U20777_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 76|53740983|53740983SU20890_M0 |T469_U20890_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 81|53539769|53539769SU20797_M0 |T417_U20797_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 82|53542786|53542786SU20798_M0 |T418_U20798_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 83|54031268|54031268SU21015_M0 |T546_U21015_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 84|54055969|54055969SU20600_M0 |T277_U20600_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 85|53753961|53753961SU20897_M0 |T474_U20897_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 90|53335572|53335572SU20704_M0 |T359_U20704_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 92|53952146|53952146SU20984_M0 |T527_U20984_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 93|53933395|53933395SU20967_M0 |T514_U20967_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 95|53964177|53964177SU20988_M0 |T531_U20988_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 96|53683884|53683884SU20864_M0 |T453_U20864_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 98|32252616|32252616SU20586_M0 |T308_U20586_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 103|53955198|53955198SU20985_M0 |T528_U20985_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 104|53961202|53961202SU20987_M0 |T530_U20987_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 106|53713330|53713330SU20876_M0 |T384_U20876_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 108|54009269|54009269SU21007_M0 |T539_U21007_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 109|53791976|53791976SU20908_M0 |T479_U20908_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 113|53825033|53825033SU20927_M0 |T493_U20927_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 115|53872183|53872183SU20947_M0 |T503_U20947_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 118|53918034|53918034SU20961_M0 |T438_U20961_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 121|53450513|53450513SU20752_M0 |T388_U20752_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 124|54169405|54169405SU21088_M0 |T587_U21088_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 126|54246530|54246530SU20673_M0 |T339_U20673_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 130|53730608|53730608SU20888_M0 |T468_U20888_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 131|53863051|53863051SU20941_M0 |T500_U20941_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 133|54022221|54022221SU21012_M0 |T543_U21012_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 135|53772320|53772320SU20903_M0 |T478_U20903_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 136|54013979|54013979SU21009_M0 |T541_U21009_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 142|53888093|53888093SU20951_M0 |T506_U20951_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 143|53958159|53958159SU20986_M0 |T529_U20986_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 145|53615810|53615810SU20828_M0 |T437_U20828_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 149|53338570|53338570SU20705_M0 |T360_U20705_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 153|53674932|53674932SU20860_M0 |T450_U20860_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 154|54028234|54028234SU21014_M0 |T545_U21014_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 157|54093287|54093287SU21054_M0 |T565_U21054_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 158|53573044|53573044SU20809_M0 |T293_U20809_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 160|53405636|53405636SU20734_M0 |T378_U20734_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 162|54160337|54160337SU21084_M0 |T584_U21084_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 163|54083666|54083666SU21050_M0 |T562_U21050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 165|53591025|53591025SU20818_M0 |T430_U20818_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 166|53940202|53940202SU20975_M0 |T519_U20975_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 169|53671936|53671936SU20859_M0 |T449_U20859_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 171|53575295|53575295SU20810_M0 |T424_U20810_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 174|54015616|54015616SU21010_M0 |T542_U21010_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 175|53402644|53402644SU20733_M0 |T377_U20733_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 176|54333571|54333571SU20703_M0 |T358_U20703_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 177|54330571|54330571SU20702_M0 |T357_U20702_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 178|53359275|53359275SU20714_M0 |T365_U20714_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 180|53426328|53426328SU20745_M0 |T385_U20745_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 182|54081559|54081559SU21049_M0 |T550_U21049_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 183|54166366|54166366SU21086_M0 |T586_U21086_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 184|54025224|54025224SU21013_M0 |T544_U21013_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 185|54150626|54150626SU21080_M0 |T581_U21080_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 186|54128970|54128970SU21069_M0 |T525_U21069_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 187|53341583|53341583SU20707_M0 |T361_U20707_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 190|53894111|53894111SU20953_M0 |T508_U20953_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 194|53944437|53944437SU20977_M0 |T521_U20977_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 195|53702262|53702262SU20874_M0 |T460_U20874_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 196|54090292|54090292SU21053_M0 |T564_U21053_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 199|53643123|53643123SU20844_M0 |T428_U20844_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 200|53612837|53612837SU20827_M0 |T436_U20827_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 201|53399648|53399648SU20732_M0 |T376_U20732_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 204|53438225|53438225SU20748_M0 |T256_U20748_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 207|54096290|54096290SU21055_M0 |T566_U21055_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 208|54034234|54034234SU21016_M0 |T547_U21016_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 209|53580137|53580137SU20812_M0 |T425_U20812_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 212|53519586|53519586SU20784_M0 |T408_U20784_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 213|53545747|53545747SU20799_M0 |T419_U20799_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 216|54141465|54141465SU20630_M0 |T328_U20630_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 220|53885142|53885142SU20950_M0 |T505_U20950_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 221|54118046|54118046SU20620_M0 |T322_U20620_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 225|53848955|53848955SU20936_M0 |T445_U20936_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 227|54012478|54012478SU21008_M0 |T540_U21008_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 234|54107318|54107318SU20615_M0 |T320_U20615_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 235|54099331|54099331SU21056_M0 |T567_U21056_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 236|53598228|53598228SU20821_M0 |T432_U20821_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 238|53988107|53988107SU20997_M0 |T522_U20997_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 239|53468725|53468725SU20761_M0 |T393_U20761_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 242|53742654|53742654SU20891_M0 |T470_U20891_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 244|54142537|54142537SU21075_M0 |T578_U21075_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 250|53800681|53800681SU20917_M0 |T485_U20917_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 251|54155465|54155465SU21082_M0 |T583_U21082_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 253|53408645|53408645SU20736_M0 |T379_U20736_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 254|53680911|53680911SU20863_M0 |T452_U20863_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 257|54163344|54163344SU21085_M0 |T585_U21085_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 259|53465735|53465735SU20760_M0 |T392_U20760_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 263|54045247|54045247SU20597_M0 |T314_U20597_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 265|53471695|53471695SU20762_M0 |T394_U20762_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 267|54153466|54153466SU21081_M0 |T582_U21081_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 271|53396676|53396676SU20731_M0 |T375_U20731_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 274|54102300|54102300SU21057_M0 |T568_U21057_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 277|54062174|54062174SU21035_M0 |T559_U21035_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 282|53511057|53511057SU20782_M0 |T406_U20782_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 284|53729452|53729452SU20887_M0 |T467_U20887_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 187 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
376 INVALID -1
377 INVALID -1
378 INVALID -1
379 INVALID -1
380 INVALID -1
381 INVALID -1
382 INVALID -1
383 INVALID -1
384 INVALID -1
385 INVALID -1
386 INVALID -1
387 INVALID -1
388 INVALID -1
389 INVALID -1
390 INVALID -1
391 INVALID -1
392 INVALID -1
393 INVALID -1
394 INVALID -1
395 INVALID -1
396 INVALID -1
397 INVALID -1
398 INVALID -1
399 INVALID -1
400 INVALID -1
401 INVALID -1
402 INVALID -1
403 INVALID -1
404 INVALID -1
405 INVALID -1
406 INVALID -1
407 INVALID -1
408 INVALID -1
409 INVALID -1
410 INVALID -1
411 INVALID -1
412 INVALID -1
413 INVALID -1
414 INVALID -1
415 INVALID -1
416 INVALID -1
417 INVALID -1
418 INVALID -1
419 INVALID -1
420 INVALID -1
421 INVALID -1
422 INVALID -1
423 INVALID -1
424 INVALID -1
425 INVALID -1
426 INVALID -1
427 INVALID -1
428 INVALID -1
429 INVALID -1
430 INVALID -1
... skip next entries
100 ca_blk slots of 6000 in use, 98 currently unowned (in request queues)
MPI Info Sun Sep 22 05:12:43 2019
------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 228 / 1884

Periodic Tasks Sun Sep 22 05:12:43 2019


------------------------------------------------------------

|Handle  |Type                |Calls     |Wait(sec) |Session             |Resp-ID   |
|--------|--------------------|----------|----------|--------------------|----------|
|       0|BUFREF              |      5715|        77|                    |          |
|       1|DDLOG               |      5715|        77|                    |          |
|       2|BTCSCHED            |     11427|        21|                    |          |
|       3|RESTART_ALL         |      2286|       133|                    |          |
|       4|ENVCHECK            |     34297|        20|                    |          |
|       5|AUTOABAP            |      2286|       133|                    |          |
|       6|BGRFC_WATCHDOG      |      2287|       133|                    |          |
|       7|AUTOTH              |       311|        21|                    |          |
|       8|AUTOCCMS            |     11427|        21|                    |          |
|       9|AUTOSECURITY        |     11426|        21|                    |          |
|      10|LOAD_CALCULATION    |    684994|         1|                    |          |
|      11|SPOOLALRM           |     11432|        21|                    |          |
|      12|CALL_DELAYED        |         0|       161|                    |          |

Found 13 periodic tasks

********** SERVER SNAPSHOT 186 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:12:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:12:53:858 2019


DpHdlSoftCancel: cancel request for T560_U21036_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T561_U21037_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:12:59:864 2019


DpHdlSoftCancel: cancel request for T554_U21040_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T557_U21039_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:13:03:848 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:13:04:157 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:13:05:811 2019


DpHdlSoftCancel: cancel request for T592_U21099_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T551_U21047_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T552_U21048_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T553_U21041_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T555_U21042_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T556_U21043_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:13:15:821 2019


DpHdlSoftCancel: cancel request for T563_U21051_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:13:23:849 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:13:24:175 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
Sun Sep 22 05:13:34:834 2019
DpHdlSoftCancel: cancel request for T569_U21061_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T570_U21062_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T571_U21063_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T572_U21065_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:13:39:840 2019


DpHdlSoftCancel: cancel request for T591_U21098_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T574_U21067_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T573_U21066_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:13:43:849 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 30256

Sun Sep 22 05:13:54:195 2019


DpHdlSoftCancel: cancel request for T524_U21071_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:13:59:200 2019


DpHdlSoftCancel: cancel request for T575_U21072_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T576_U21073_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:14:03:850 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
Sun Sep 22 05:14:04:157 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
DpHdlSoftCancel: cancel request for T579_U21076_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:14:09:209 2019


DpHdlSoftCancel: cancel request for T580_U21079_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:14:22:606 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:14:23:851 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 30256 terminated

Sun Sep 22 05:14:24:547 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:14:29:219 2019


DpHdlSoftCancel: cancel request for T589_U21092_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:14:39:226 2019


DpHdlSoftCancel: cancel request for T590_U21094_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:14:43:851 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:14:44:563 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:14:54:992 2019


DpHdlSoftCancel: cancel request for T577_U21097_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:14:59:992 2019


DpHdlSoftCancel: cancel request for T594_U21102_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:03:852 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-31048
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-31049
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-31050
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-31051
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-31052

Sun Sep 22 05:15:04:159 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:15:04:280 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:15:04:889 2019


*** ERROR => DpHdlDeadWp: W0 (pid 31048) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31048) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 31048)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 31049) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31049) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 31049)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 31050) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31050) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 31050)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 31051) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31051) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 31051)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 31052) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=31052) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 31052)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:15:04:997 2019


DpHdlSoftCancel: cancel request for T596_U21105_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T595_U21104_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T598_U21108_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:10:002 2019


DpHdlSoftCancel: cancel request for T599_U21109_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:15:005 2019


DpHdlSoftCancel: cancel request for T601_U21111_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:23:853 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
Sun Sep 22 05:15:24:298 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:15:34:018 2019


DpHdlSoftCancel: cancel request for T597_U21120_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T607_U21121_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T608_U21122_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T609_U21124_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:39:023 2019


DpHdlSoftCancel: cancel request for T610_U21125_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T558_U21031_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:43:853 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:15:44:026 2019


DpHdlSoftCancel: cancel request for T611_U21126_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:54:322 2019


DpHdlSoftCancel: cancel request for T612_U21130_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T613_U21131_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:15:59:327 2019


DpHdlSoftCancel: cancel request for T616_U21135_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:16:03:854 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:16:04:159 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:16:04:330 2019


DpHdlSoftCancel: cancel request for T617_U21136_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T618_U21138_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:16:23:854 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:16:24:351 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:16:29:344 2019


DpHdlSoftCancel: cancel request for T626_U21152_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:16:43:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
Sun Sep 22 05:16:44:368 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:16:52:981 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.916257 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.915940 / 0.000000

Sun Sep 22 05:16:54:111 2019


DpHdlSoftCancel: cancel request for T627_U21157_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T628_U21159_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:16:57:985 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 05:17:03:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:17:04:160 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:17:04:267 2019


DpHdlSoftCancel: cancel request for T630_U21161_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T631_U21162_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:17:18:355 2019


DpHdlSoftCancel: cancel request for T633_U21167_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:17:23:855 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:17:24:404 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:17:34:367 2019


DpHdlSoftCancel: cancel request for T422_U21176_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:17:43:856 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:17:44:374 2019


DpHdlSoftCancel: cancel request for T640_U21179_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:17:54:382 2019


DpHdlSoftCancel: cancel request for T643_U21185_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T642_U21184_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:17:59:387 2019


DpHdlSoftCancel: cancel request for T645_U21188_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:18:03:857 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:18:04:161 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:18:04:392 2019


DpHdlSoftCancel: cancel request for T646_U21189_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T647_U21190_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T648_U21191_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T649_U21192_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T650_U21193_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:18:09:734 2019


DpHdlSoftCancel: cancel request for T683_U21257_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T651_U21197_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:18:23:857 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:13:43 2019, skip new snapshot

Sun Sep 22 05:18:24:459 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:18:34:758 2019


DpHdlSoftCancel: cancel request for T659_U21209_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T660_U21210_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T662_U21212_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T663_U21213_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T661_U21211_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
Sun Sep 22 05:18:39:759 2019
DpHdlSoftCancel: cancel request for T682_U21256_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T672_U21223_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:18:43:858 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 187 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:18:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 05:18:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 1
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 05:18:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 2098 (peak 2098, writeCount 24105943, readCount 24103845)


UPD : 0 (peak 31, writeCount 4961, readCount 4961)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125413, readCount 2125413)
SPO : 0 (peak 2, writeCount 25122, readCount 25122)
UP2 : 0 (peak 1, writeCount 2347, readCount 2347)
DISP: 0 (peak 67, writeCount 890264, readCount 890264)
GW : 0 (peak 49, writeCount 22411023, readCount 22411023)
ICM : 0 (peak 186, writeCount 391115, readCount 391115)
LWP : 6 (peak 16, writeCount 38290, readCount 38284)

Session queue dump (high priority, 1 elements, peak 39):
-1 <- 26 < EmbryoQueue_DIA> -> -1
Session queue dump (normal priority, 690 elements, peak 690):
-1 <- 703 < T672_U21223_M0> -> 713
703 <- 713 < T682_U21256_M0> -> 692
713 <- 692 < T661_U21211_M0> -> 694
692 <- 694 < T663_U21213_M0> -> 693
694 <- 693 < T662_U21212_M0> -> 691
693 <- 691 < T660_U21210_M0> -> 690
691 <- 690 < T659_U21209_M0> -> 532
690 <- 532 < T501_U20942_M0> -> 682
532 <- 682 < T651_U21197_M0> -> 714
682 <- 714 < T683_U21257_M0> -> 681
714 <- 681 < T650_U21193_M0> -> 680
681 <- 680 < T649_U21192_M0> -> 679
680 <- 679 < T648_U21191_M0> -> 678
679 <- 678 < T647_U21190_M0> -> 677
678 <- 677 < T646_U21189_M0> -> 676
677 <- 676 < T645_U21188_M0> -> 528
676 <- 528 < T497_U20937_M0> -> 530
528 <- 530 < T499_U20939_M0> -> 529
530 <- 529 < T498_U20938_M0> -> 673
529 <- 673 < T642_U21184_M0> -> 674
673 <- 674 < T643_U21185_M0> -> 527
674 <- 527 < T496_U20933_M0> -> 526
527 <- 526 < T495_U20931_M0> -> 514
526 <- 514 < T483_U20929_M0> -> 525
514 <- 525 < T494_U20930_M0> -> 453
525 <- 453 < T422_U21176_M0> -> 518
453 <- 518 < T487_U20919_M0> -> 664
518 <- 664 < T633_U21167_M0> -> 478
664 <- 478 < T447_U20910_M0> -> 513
478 <- 513 < T482_U20914_M0> -> 512
513 <- 512 < T481_U20912_M0> -> 495
512 <- 495 < T464_U20911_M0> -> 331
495 <- 331 < T300_U20907_M0> -> 511
331 <- 511 < T480_U20909_M0> -> 662
511 <- 662 < T631_U21162_M0> -> 661
662 <- 661 < T630_U21161_M0> -> 659
661 <- 659 < T628_U21159_M0> -> 658
659 <- 658 < T627_U21157_M0> -> 508
658 <- 508 < T477_U20902_M0> -> 507
508 <- 507 < T476_U20901_M0> -> 506
507 <- 506 < T475_U20899_M0> -> 657
506 <- 657 < T626_U21152_M0> -> 497
657 <- 497 < T466_U20886_M0> -> 496
497 <- 496 < T465_U20884_M0> -> 494
496 <- 494 < T463_U20881_M0> -> 493
494 <- 493 < T462_U20880_M0> -> 649
493 <- 649 < T618_U21138_M0> -> 648
649 <- 648 < T617_U21136_M0> -> 647
648 <- 647 < T616_U21135_M0> -> 492
647 <- 492 < T461_U20878_M0> -> 429
492 <- 429 < T398_U20877_M0> -> 644
429 <- 644 < T613_U21131_M0> -> 643
644 <- 643 < T612_U21130_M0> -> 490
643 <- 490 < T459_U20873_M0> -> 489
490 <- 489 < T458_U20872_M0> -> 488
489 <- 488 < T457_U20870_M0> -> 589
488 <- 589 < T558_U21031_M0> -> 641
589 <- 641 < T610_U21125_M0> -> 487
641 <- 487 < T456_U20869_M0> -> 486
487 <- 486 < T455_U20868_M0> -> 485
486 <- 485 < T454_U20867_M0> -> 640
485 <- 640 < T609_U21124_M0> -> 639
640 <- 639 < T608_U21122_M0> -> 638
639 <- 638 < T607_U21121_M0> -> 628
638 <- 628 < T597_U21120_M0> -> 471
628 <- 471 < T444_U20853_M0> -> 474
471 <- 474 < T443_U20850_M0> -> 473
474 <- 473 < T442_U20851_M0> -> 630
473 <- 630 < T599_U21109_M0> -> 476
630 <- 476 < T441_U20847_M0> -> 629
476 <- 629 < T598_U21108_M0> -> 626
629 <- 626 < T595_U21104_M0> -> 627
626 <- 627 < T596_U21105_M0> -> 479
627 <- 479 < T448_U20845_M0> -> 625
479 <- 625 < T594_U21102_M0> -> 608
625 <- 608 < T577_U21097_M0> -> 477
608 <- 477 < T446_U20840_M0> -> 621
477 <- 621 < T590_U21094_M0> -> 620
621 <- 620 < T589_U21092_M0> -> 462
620 <- 462 < T431_U20819_M0> -> 460
462 <- 460 < T429_U20817_M0> -> 457
460 <- 457 < T426_U20813_M0> -> 458
457 <- 458 < T427_U20814_M0> -> 611
458 <- 611 < T580_U21079_M0> -> 610
611 <- 610 < T579_U21076_M0> -> 607
610 <- 607 < T576_U21073_M0> -> 606
607 <- 606 < T575_U21072_M0> -> 555
606 <- 555 < T524_U21071_M0> -> 454
555 <- 454 < T423_U20807_M0> -> 604
454 <- 604 < T573_U21066_M0> -> 605
604 <- 605 < T574_U21067_M0> -> 452
605 <- 452 < T421_U20804_M0> -> 451
452 <- 451 < T420_U20803_M0> -> 622
451 <- 622 < T591_U21098_M0> -> 301
622 <- 301 < T270_U20802_M0> -> 298
301 <- 298 < T267_U20801_M0> -> 603
298 <- 603 < T572_U21065_M0> -> 602
603 <- 602 < T571_U21063_M0> -> 601
602 <- 601 < T570_U21062_M0> -> 600
601 <- 600 < T569_U21061_M0> -> 445
600 <- 445 < T414_U20793_M0> -> 444
445 <- 444 < T413_U20790_M0> -> 443
444 <- 443 < T412_U20789_M0> -> 594
443 <- 594 < T563_U21051_M0> -> 441
594 <- 441 < T410_U20786_M0> -> 440
441 <- 440 < T409_U20785_M0> -> 438
440 <- 438 < T407_U20783_M0> -> 585
438 <- 585 < T556_U21043_M0> -> 586
585 <- 586 < T555_U21042_M0> -> 584
586 <- 584 < T553_U21041_M0> -> 436
584 <- 436 < T405_U20781_M0> -> 583
436 <- 583 < T552_U21048_M0> -> 582
583 <- 582 < T551_U21047_M0> -> 623
582 <- 623 < T592_U21099_M0> -> 587
623 <- 587 < T554_U21040_M0> -> 431
587 <- 431 < T400_U20775_M0> -> 435
431 <- 435 < T403_U20778_M0> -> 434
435 <- 434 < T404_U20779_M0> -> 432
434 <- 432 < T401_U20776_M0> -> 430
432 <- 430 < T399_U20774_M0> -> 592
430 <- 592 < T561_U21037_M0> -> 591
592 <- 591 < T560_U21036_M0> -> 428
591 <- 428 < T397_U20769_M0> -> 544
428 <- 544 < T513_U20966_M0> -> 395
544 <- 395 < T364_U20713_M0> -> 371
395 <- 371 < T342_U20674_M0> -> 354
371 <- 354 < T323_U20621_M0> -> 346
354 <- 346 < T315_U20602_M0> -> 344
346 <- 344 < T313_U20596_M0> -> 319
344 <- 319 < T288_U20578_M0> -> 291
319 <- 291 < T260_U20481_M0> -> 286
291 <- 286 < T255_U20475_M0> -> 284
286 <- 284 < T253_U20470_M0> -> 282
284 <- 282 < T251_U20559_M0> -> 231
282 <- 231 < T200_U20354_M0> -> 228
231 <- 228 < T197_U20347_M0> -> 221
228 <- 221 < T190_U20350_M0> -> 62
221 <- 62 < T121_U20226_M0> -> 78
62 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 100
69 <- 100 < T38_U20216_M0> -> 54
100 <- 54 < T23_U20175_M0> -> 89
54 <- 89 < T12_U20220_M0> -> 579
89 <- 579 < T548_U21020_M0> -> 421
579 <- 421 < T390_U20755_M0> -> 418
421 <- 418 < T387_U20751_M0> -> 420
418 <- 420 < T389_U20754_M0> -> 568
420 <- 568 < T537_U21002_M0> -> 569
568 <- 569 < T538_U21004_M0> -> 417
569 <- 417 < T386_U20750_M0> -> 567
417 <- 567 < T536_U21001_M0> -> 566
567 <- 566 < T535_U21000_M0> -> 553
566 <- 553 < T523_U20998_M0> -> 414
553 <- 414 < T383_U20742_M0> -> 413
414 <- 413 < T382_U20741_M0> -> 412
413 <- 412 < T381_U20740_M0> -> 411
412 <- 411 < T380_U20739_M0> -> 404
411 <- 404 < T373_U20738_M0> -> 239
404 <- 239 < T208_U20991_M0> -> 377
239 <- 377 < T346_U20990_M0> -> 564
377 <- 564 < T533_U20994_M0> -> 563
564 <- 563 < T532_U20992_M0> -> 557
563 <- 557 < T526_U20982_M0> -> 405
557 <- 405 < T374_U20729_M0> -> 402
405 <- 402 < T371_U20725_M0> -> 401
402 <- 401 < T370_U20724_M0> -> 403
401 <- 403 < T372_U20727_M0> -> 379
403 <- 379 < T348_U20723_M0> -> 549
379 <- 549 < T518_U20974_M0> -> 398
549 <- 398 < T367_U20720_M0> -> 400
398 <- 400 < T369_U20722_M0> -> 399
400 <- 399 < T368_U20721_M0> -> 548
399 <- 548 < T517_U20971_M0> -> 546
548 <- 546 < T515_U20968_M0> -> 547
546 <- 547 < T516_U20969_M0> -> 543
547 <- 543 < T512_U20965_M0> -> 369
543 <- 369 < T338_U20717_M0> -> 397
369 <- 397 < T366_U20718_M0> -> 542
397 <- 542 < T511_U20963_M0> -> 541
542 <- 541 < T510_U20962_M0> -> 540
541 <- 540 < T509_U20958_M0> -> 393
540 <- 393 < T362_U20711_M0> -> 394
393 <- 394 < T363_U20712_M0> -> 387
394 <- 387 < T356_U20700_M0> -> 386
387 <- 386 < T355_U20698_M0> -> 385
386 <- 385 < T354_U20697_M0> -> 384
385 <- 384 < T353_U20694_M0> -> 372
384 <- 372 < T343_U20691_M0> -> 245
372 <- 245 < T213_U20687_M0> -> 373
245 <- 373 < T345_U20679_M0> -> 374
373 <- 374 < T340_U20677_M0> -> 375
374 <- 375 < T344_U20676_M0> -> 382
375 <- 382 < T351_U20671_M0> -> 381
382 <- 381 < T350_U20670_M0> -> 378
381 <- 378 < T347_U20664_M0> -> 368
378 <- 368 < T337_U20653_M0> -> 367
368 <- 367 < T336_U20648_M0> -> 366
367 <- 366 < T335_U20645_M0> -> 364
366 <- 364 < T333_U20643_M0> -> 363
364 <- 363 < T332_U20642_M0> -> 361
363 <- 361 < T330_U20639_M0> -> 248
361 <- 248 < T214_U20638_M0> -> 327
248 <- 327 < T296_U20636_M0> -> 360
327 <- 360 < T329_U20632_M0> -> 357
360 <- 357 < T326_U20626_M0> -> 358
357 <- 358 < T327_U20627_M0> -> 356
358 <- 356 < T325_U20624_M0> -> 355
356 <- 355 < T324_U20622_M0> -> 222
355 <- 222 < T191_U20619_M0> -> 352
222 <- 352 < T321_U20616_M0> -> 350
352 <- 350 < T319_U20613_M0> -> 348
350 <- 348 < T317_U20605_M0> -> 347
348 <- 347 < T316_U20603_M0> -> 349
347 <- 349 < T318_U20606_M0> -> 343
349 <- 343 < T312_U20595_M0> -> 342
343 <- 342 < T311_U20594_M0> -> 341
342 <- 341 < T310_U20592_M0> -> 340
341 <- 340 < T309_U20587_M0> -> 338
340 <- 338 < T307_U20585_M0> -> 318
338 <- 318 < T287_U20582_M0> -> 337
318 <- 337 < T306_U20581_M0> -> 336
337 <- 336 < T305_U20580_M0> -> 334
336 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> 333
335 <- 333 < T302_U20568_M0> -> 332
333 <- 332 < T301_U20567_M0> -> 330
332 <- 330 < T299_U20563_M0> -> 328
330 <- 328 < T297_U20560_M0> -> 325
328 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 309
326 <- 309 < T280_U20543_M0> -> 322
309 <- 322 < T291_U20546_M0> -> 312
322 <- 312 < T283_U20541_M0> -> 310
312 <- 310 < T282_U20544_M0> -> 323
310 <- 323 < T292_U20547_M0> -> 308
323 <- 308 < T276_U20537_M0> -> 314
308 <- 314 < T278_U20536_M0> -> 277
314 <- 277 < T246_U20530_M0> -> 320
277 <- 320 < T289_U20532_M0> -> 317
320 <- 317 < T286_U20526_M0> -> 316
317 <- 316 < T285_U20525_M0> -> 315
316 <- 315 < T284_U20523_M0> -> 306
315 <- 306 < T275_U20514_M0> -> 276
306 <- 276 < T245_U20504_M0> -> 303
276 <- 303 < T272_U20501_M0> -> 300
303 <- 300 < T269_U20497_M0> -> 299
300 <- 299 < T268_U20492_M0> -> 297
299 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 292
296 <- 292 < T261_U20483_M0> -> 294
292 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 290
293 <- 290 < T259_U20480_M0> -> 289
290 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 281
288 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 275
274 <- 275 < T244_U20464_M0> -> 280
275 <- 280 < T249_U20458_M0> -> 271
280 <- 271 < T240_U20455_M0> -> 135
271 <- 135 < T71_U20451_M0> -> 278
135 <- 278 < T247_U20447_M0> -> 273
278 <- 273 < T242_U20441_M0> -> 272
273 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 270
241 <- 270 < T239_U20436_M0> -> 268
270 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 267
258 <- 267 < T236_U20429_M0> -> 265
267 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 264
266 <- 264 < T233_U20426_M0> -> 263
264 <- 263 < T232_U20424_M0> -> 262
263 <- 262 < T231_U20419_M0> -> 261
262 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 229
260 <- 229 < T198_U20411_M0> -> 259
229 <- 259 < T228_U20415_M0> -> 257
259 <- 257 < T226_U20412_M0> -> 119
257 <- 119 < T11_U20406_M0> -> 242
119 <- 242 < T212_U20408_M0> -> 247
242 <- 247 < T211_U20403_M0> -> 246
247 <- 246 < T215_U20400_M0> -> 255
246 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 253
249 <- 253 < T222_U20386_M0> -> 252
253 <- 252 < T221_U20383_M0> -> 240
252 <- 240 < T209_U20370_M0> -> 250
240 <- 250 < T219_U20381_M0> -> 237
250 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 236
238 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 230
233 <- 230 < T199_U20353_M0> -> 192
230 <- 192 < T161_U20349_M0> -> 226
192 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 220
224 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 213
218 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 206
210 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 202
204 <- 202 < T171_U20295_M0> -> 201
202 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 196
199 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 193
103 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 189
91 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 182
191 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 190
176 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 184
158 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 188
154 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 175
160 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 146
152 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 138
76 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 45
59 <- 45 < T76_U20199_M0> -> 39
45 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 148
171 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 129
170 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 58
144 <- 58 < T77_U20172_M0> -> 117
58 <- 117 < T31_U20168_M0> -> 60
117 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 70
163 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 118
114 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 93
167 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 285
283 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 321
305 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 311
313 <- 311 < T279_U20542_M0> -> 165
311 <- 165 < T75_U20558_M0> -> 329
165 <- 329 < T298_U20562_M0> -> 339
329 <- 339 < T308_U20586_M0> -> 345
339 <- 345 < T314_U20597_M0> -> 307
345 <- 307 < T277_U20600_M0> -> 351
307 <- 351 < T320_U20615_M0> -> 353
351 <- 353 < T322_U20620_M0> -> 359
353 <- 359 < T328_U20630_M0> -> 362
359 <- 362 < T331_U20641_M0> -> 380
362 <- 380 < T349_U20669_M0> -> 376
380 <- 376 < T339_U20673_M0> -> 370
376 <- 370 < T341_U20690_M0> -> 365
370 <- 365 < T334_U20692_M0> -> 388
365 <- 388 < T357_U20702_M0> -> 389
388 <- 389 < T358_U20703_M0> -> 390
389 <- 390 < T359_U20704_M0> -> 391
390 <- 391 < T360_U20705_M0> -> 392
391 <- 392 < T361_U20707_M0> -> 396
392 <- 396 < T365_U20714_M0> -> 383
396 <- 383 < T352_U20716_M0> -> 406
383 <- 406 < T375_U20731_M0> -> 407
406 <- 407 < T376_U20732_M0> -> 408
407 <- 408 < T377_U20733_M0> -> 409
408 <- 409 < T378_U20734_M0> -> 410
409 <- 410 < T379_U20736_M0> -> 416
410 <- 416 < T385_U20745_M0> -> 287
416 <- 287 < T256_U20748_M0> -> 419
287 <- 419 < T388_U20752_M0> -> 422
419 <- 422 < T391_U20758_M0> -> 423
422 <- 423 < T392_U20760_M0> -> 424
423 <- 424 < T393_U20761_M0> -> 425
424 <- 425 < T394_U20762_M0> -> 426
425 <- 426 < T395_U20763_M0> -> 427
426 <- 427 < T396_U20764_M0> -> 433
427 <- 433 < T402_U20777_M0> -> 437
433 <- 437 < T406_U20782_M0> -> 439
437 <- 439 < T408_U20784_M0> -> 446
439 <- 446 < T415_U20795_M0> -> 447
446 <- 447 < T416_U20796_M0> -> 448
447 <- 448 < T417_U20797_M0> -> 449
448 <- 449 < T418_U20798_M0> -> 450
449 <- 450 < T419_U20799_M0> -> 324
450 <- 324 < T293_U20809_M0> -> 455
324 <- 455 < T424_U20810_M0> -> 456
455 <- 456 < T425_U20812_M0> -> 461
456 <- 461 < T430_U20818_M0> -> 463
461 <- 463 < T432_U20821_M0> -> 464
463 <- 464 < T433_U20823_M0> -> 465
464 <- 465 < T434_U20824_M0> -> 466
465 <- 466 < T435_U20825_M0> -> 467
466 <- 467 < T436_U20827_M0> -> 468
467 <- 468 < T437_U20828_M0> -> 459
468 <- 459 < T428_U20844_M0> -> 472
459 <- 472 < T440_U20854_M0> -> 470
472 <- 470 < T439_U20856_M0> -> 480
470 <- 480 < T449_U20859_M0> -> 481
480 <- 481 < T450_U20860_M0> -> 482
481 <- 482 < T451_U20861_M0> -> 483
482 <- 483 < T452_U20863_M0> -> 484
483 <- 484 < T453_U20864_M0> -> 491
484 <- 491 < T460_U20874_M0> -> 415
491 <- 415 < T384_U20876_M0> -> 498
415 <- 498 < T467_U20887_M0> -> 499
498 <- 499 < T468_U20888_M0> -> 500
499 <- 500 < T469_U20890_M0> -> 501
500 <- 501 < T470_U20891_M0> -> 502
501 <- 502 < T471_U20892_M0> -> 503
502 <- 503 < T472_U20893_M0> -> 504
503 <- 504 < T473_U20896_M0> -> 505
504 <- 505 < T474_U20897_M0> -> 509
505 <- 509 < T478_U20903_M0> -> 510
509 <- 510 < T479_U20908_M0> -> 515
510 <- 515 < T484_U20916_M0> -> 516
515 <- 516 < T485_U20917_M0> -> 517
516 <- 517 < T486_U20918_M0> -> 519
517 <- 519 < T488_U20921_M0> -> 520
519 <- 520 < T489_U20922_M0> -> 521
520 <- 521 < T490_U20923_M0> -> 522
521 <- 522 < T491_U20924_M0> -> 523
522 <- 523 < T492_U20926_M0> -> 524
523 <- 524 < T493_U20927_M0> -> 475
524 <- 475 < T445_U20936_M0> -> 531
475 <- 531 < T500_U20941_M0> -> 533
531 <- 533 < T502_U20946_M0> -> 534
533 <- 534 < T503_U20947_M0> -> 535
534 <- 535 < T504_U20949_M0> -> 536
535 <- 536 < T505_U20950_M0> -> 537
536 <- 537 < T506_U20951_M0> -> 538
537 <- 538 < T507_U20952_M0> -> 539
538 <- 539 < T508_U20953_M0> -> 469
539 <- 469 < T438_U20961_M0> -> 545
469 <- 545 < T514_U20967_M0> -> 550
545 <- 550 < T519_U20975_M0> -> 551
550 <- 551 < T520_U20976_M0> -> 552
551 <- 552 < T521_U20977_M0> -> 558
552 <- 558 < T527_U20984_M0> -> 559
558 <- 559 < T528_U20985_M0> -> 560
559 <- 560 < T529_U20986_M0> -> 561
560 <- 561 < T530_U20987_M0> -> 562
561 <- 562 < T531_U20988_M0> -> 554
562 <- 554 < T522_U20997_M0> -> 570
554 <- 570 < T539_U21007_M0> -> 571
570 <- 571 < T540_U21008_M0> -> 572
571 <- 572 < T541_U21009_M0> -> 573
572 <- 573 < T542_U21010_M0> -> 574
573 <- 574 < T543_U21012_M0> -> 575
574 <- 575 < T544_U21013_M0> -> 576
575 <- 576 < T545_U21014_M0> -> 577
576 <- 577 < T546_U21015_M0> -> 578
577 <- 578 < T547_U21016_M0> -> 590
578 <- 590 < T559_U21035_M0> -> 581
590 <- 581 < T550_U21049_M0> -> 593
581 <- 593 < T562_U21050_M0> -> 595
593 <- 595 < T564_U21053_M0> -> 596
595 <- 596 < T565_U21054_M0> -> 597
596 <- 597 < T566_U21055_M0> -> 598
597 <- 598 < T567_U21056_M0> -> 599
598 <- 599 < T568_U21057_M0> -> 556
599 <- 556 < T525_U21069_M0> -> 609
556 <- 609 < T578_U21075_M0> -> 612
609 <- 612 < T581_U21080_M0> -> 613
612 <- 613 < T582_U21081_M0> -> 614
613 <- 614 < T583_U21082_M0> -> 615
614 <- 615 < T584_U21084_M0> -> 616
615 <- 616 < T585_U21085_M0> -> 617
616 <- 617 < T586_U21086_M0> -> 618
617 <- 618 < T587_U21088_M0> -> 619
618 <- 619 < T588_U21089_M0> -> 565
619 <- 565 < T534_U21096_M0> -> 624
565 <- 624 < T593_U21101_M0> -> 580
624 <- 580 < T549_U21103_M0> -> 631
580 <- 631 < T600_U21110_M0> -> 633
631 <- 633 < T602_U21113_M0> -> 634
633 <- 634 < T603_U21114_M0> -> 635
634 <- 635 < T604_U21115_M0> -> 636
635 <- 636 < T605_U21117_M0> -> 637
636 <- 637 < T606_U21118_M0> -> 588
637 <- 588 < T557_U21129_M0> -> 645
588 <- 645 < T614_U21133_M0> -> 646
645 <- 646 < T615_U21134_M0> -> 650
646 <- 650 < T619_U21141_M0> -> 651
650 <- 651 < T620_U21142_M0> -> 652
651 <- 652 < T621_U21144_M0> -> 653
652 <- 653 < T622_U21145_M0> -> 654
653 <- 654 < T623_U21146_M0> -> 655
654 <- 655 < T624_U21147_M0> -> 656
655 <- 656 < T625_U21148_M0> -> 660
656 <- 660 < T629_U21160_M0> -> 663
660 <- 663 < T632_U21166_M0> -> 665
663 <- 665 < T634_U21169_M0> -> 666
665 <- 666 < T635_U21170_M0> -> 667
666 <- 667 < T636_U21171_M0> -> 668
667 <- 668 < T637_U21172_M0> -> 669
668 <- 669 < T638_U21173_M0> -> 670
669 <- 670 < T639_U21178_M0> -> 672
670 <- 672 < T641_U21180_M0> -> 632
672 <- 632 < T601_U21183_M0> -> 675
632 <- 675 < T644_U21187_M0> -> 683
675 <- 683 < T652_U21198_M0> -> 684
683 <- 684 < T653_U21199_M0> -> 685
684 <- 685 < T654_U21201_M0> -> 686
685 <- 686 < T655_U21202_M0> -> 687
686 <- 687 < T656_U21203_M0> -> 688
687 <- 688 < T657_U21204_M0> -> 689
688 <- 689 < T658_U21205_M0> -> 704
689 <- 704 < T673_U21224_M0> -> 705
704 <- 705 < T674_U21228_M0> -> 706
705 <- 706 < T675_U21230_M0> -> 701
706 <- 701 < T669_U21231_M0> -> 698
701 <- 698 < T671_U21232_M0> -> 700
698 <- 700 < T668_U21233_M0> -> 702
700 <- 702 < T670_U21234_M0> -> 699
702 <- 699 < T666_U21238_M0> -> 697
699 <- 697 < T667_U21239_M0> -> 695
697 <- 695 < T664_U21241_M0> -> 696
695 <- 696 < T665_U21242_M0> -> 707
696 <- 707 < T676_U21243_M0> -> 708
707 <- 708 < T677_U21244_M0> -> 709
708 <- 709 < T678_U21245_M0> -> 710
709 <- 710 < T679_U21249_M0> -> 711
710 <- 711 < T680_U21251_M0> -> 442
711 <- 442 < T411_U21254_M0> -> 712
442 <- 712 < T681_U21255_M0> -> 715
712 <- 715 < T684_U21259_M0> -> 716
715 <- 716 < T685_U21260_M0> -> 717
716 <- 717 < T686_U21261_M0> -> 718
717 <- 718 < T687_U21262_M0> -> 719
718 <- 719 < T688_U21263_M0> -> 720
719 <- 720 < T689_U21264_M0> -> 721
720 <- 721 < T690_U21266_M0> -> 722
721 <- 722 < T691_U21269_M0> -> 723
722 <- 723 < T692_U21270_M0> -> 724
723 <- 724 < T693_U21272_M0> -> 725
724 <- 725 < T694_U21273_M0> -> 726
725 <- 726 < T695_U21274_M0> -> 727
726 <- 727 < T696_U21275_M0> -> 728
727 <- 728 < T697_U21276_M0> -> 729
728 <- 729 < T698_U21280_M0> -> 730
729 <- 730 < T699_U21281_M0> -> 731
730 <- 731 < T700_U21282_M0> -> 732
731 <- 732 < T701_U21283_M0> -> 733
732 <- 733 < T702_U21285_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1
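Each session queue dump above is a doubly linked list: every line reads `prev <- index < session> -> next`, with `-1` terminating the chain at both ends. A minimal sketch (assuming this line format; the two-element low-priority dump is used as sample data) that parses such a dump and walks it head to tail:

```python
import re

# Sample taken from the low-priority session queue dump above.
dump = """\
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1
"""

# prev <- index < session> -> next  (all indices may be -1)
pattern = re.compile(r"(-?\d+) <- (-?\d+) <\s*(\S+)> -> (-?\d+)")

nodes = {}
for line in dump.splitlines():
    m = pattern.match(line.strip())
    if m:
        prev_i, idx, name, next_i = m.groups()
        nodes[int(idx)] = (int(prev_i), name, int(next_i))

# The head is the node whose prev pointer is -1; follow next pointers
# until the -1 terminator to recover the queue order.
head = next(i for i, (p, _, _) in nodes.items() if p == -1)
order = []
i = head
while i != -1:
    _, name, next_i = nodes[i]
    order.append(name)
    i = next_i

print(order)  # -> ['T30_U25456_M0', 'T103_U20248_M0']
```

Walking the full normal-priority dump the same way recovers all 690 queued sessions in dispatch order, which is useful when checking a snapshot for stuck sessions.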

Requests in queue <W0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W5> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W6> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W7> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <EmbryoQueue_DIA> (3 requests):
- 1 requests for handler REQ_HANDLER_DELAY
- 1 requests for handler REQ_HANDLER_ADM
- 1 requests for handler REQ_HANDLER_AUTOSECURITY
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (14 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (14 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (15 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (27 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (14 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (16 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (20 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20619_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (15 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (20 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20991_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20687_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T215_U20400_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20638_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T247_U20447_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (19 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (19 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20748_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T257_U20478_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (19 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20801_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20802_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20600_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T276_U20537_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T280_U20543_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T282_U20544_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T284_U20523_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20582_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T288_U20578_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T289_U20532_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T292_U20547_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T293_U20809_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T294_U20553_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T295_U20554_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T296_U20636_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T297_U20560_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T300_U20907_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T301_U20567_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T302_U20568_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T303_U20574_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T304_U20575_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T305_U20580_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T306_U20581_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T307_U20585_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T308_U20586_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T309_U20587_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T310_U20592_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T311_U20594_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T312_U20595_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T313_U20596_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T314_U20597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T315_U20602_M0> (18 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T316_U20603_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T317_U20605_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T318_U20606_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T319_U20613_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T320_U20615_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T321_U20616_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T322_U20620_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T323_U20621_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T324_U20622_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T325_U20624_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T326_U20626_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T327_U20627_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T328_U20630_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T329_U20632_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T330_U20639_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T331_U20641_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T332_U20642_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T333_U20643_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T334_U20692_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T335_U20645_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T336_U20648_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T337_U20653_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T338_U20717_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T341_U20690_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T342_U20674_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T343_U20691_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T345_U20679_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T340_U20677_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T344_U20676_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T339_U20673_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T346_U20990_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T347_U20664_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T348_U20723_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T349_U20669_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T350_U20670_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T351_U20671_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T352_U20716_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T353_U20694_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T354_U20697_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T355_U20698_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T356_U20700_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T357_U20702_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T358_U20703_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T359_U20704_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T360_U20705_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T361_U20707_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T362_U20711_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T363_U20712_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T364_U20713_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T365_U20714_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T366_U20718_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T367_U20720_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T368_U20721_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T369_U20722_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T370_U20724_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T371_U20725_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T372_U20727_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T373_U20738_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T374_U20729_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T375_U20731_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T376_U20732_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T377_U20733_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T378_U20734_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T379_U20736_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T380_U20739_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T381_U20740_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T382_U20741_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T383_U20742_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T384_U20876_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T385_U20745_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T386_U20750_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T387_U20751_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T388_U20752_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T389_U20754_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T390_U20755_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T391_U20758_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T392_U20760_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T393_U20761_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T394_U20762_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T395_U20763_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T396_U20764_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T397_U20769_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T398_U20877_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T399_U20774_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T400_U20775_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T401_U20776_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T402_U20777_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T404_U20779_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T403_U20778_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T405_U20781_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T406_U20782_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T407_U20783_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T408_U20784_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T409_U20785_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T410_U20786_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T411_U21254_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T412_U20789_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T413_U20790_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T414_U20793_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T415_U20795_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T416_U20796_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T417_U20797_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T418_U20798_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T419_U20799_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T420_U20803_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T421_U20804_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T422_U21176_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T423_U20807_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T424_U20810_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T425_U20812_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T426_U20813_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T427_U20814_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T428_U20844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T429_U20817_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T430_U20818_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T431_U20819_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T432_U20821_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T433_U20823_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T434_U20824_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T435_U20825_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T436_U20827_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T437_U20828_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T438_U20961_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T439_U20856_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T444_U20853_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T440_U20854_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T442_U20851_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T443_U20850_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T445_U20936_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T441_U20847_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T446_U20840_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T447_U20910_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T448_U20845_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T449_U20859_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T450_U20860_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T451_U20861_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T452_U20863_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T453_U20864_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T454_U20867_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T455_U20868_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T456_U20869_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T457_U20870_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T458_U20872_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T459_U20873_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T460_U20874_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T461_U20878_M0> (19 requests):
- 16 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T462_U20880_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T463_U20881_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T464_U20911_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T465_U20884_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T466_U20886_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T467_U20887_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T468_U20888_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T469_U20890_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T470_U20891_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T471_U20892_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T472_U20893_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T473_U20896_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T474_U20897_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T475_U20899_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T476_U20901_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T477_U20902_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T478_U20903_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T479_U20908_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T480_U20909_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T481_U20912_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T482_U20914_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T483_U20929_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T484_U20916_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T485_U20917_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T486_U20918_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T487_U20919_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T488_U20921_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T489_U20922_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T490_U20923_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T491_U20924_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T492_U20926_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T493_U20927_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T494_U20930_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T495_U20931_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T496_U20933_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T497_U20937_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T498_U20938_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T499_U20939_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T500_U20941_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T501_U20942_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T502_U20946_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T503_U20947_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T504_U20949_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T505_U20950_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T506_U20951_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T507_U20952_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T508_U20953_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T509_U20958_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T510_U20962_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T511_U20963_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T512_U20965_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T513_U20966_M0> (20 requests):
- 18 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T514_U20967_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T515_U20968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T516_U20969_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T517_U20971_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T518_U20974_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T519_U20975_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T520_U20976_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T521_U20977_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T523_U20998_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T522_U20997_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T524_U21071_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T525_U21069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T526_U20982_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T527_U20984_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T528_U20985_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T529_U20986_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T530_U20987_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T531_U20988_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T532_U20992_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T533_U20994_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T534_U21096_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T535_U21000_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T536_U21001_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T537_U21002_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T538_U21004_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T539_U21007_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T540_U21008_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T541_U21009_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T542_U21010_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T543_U21012_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T544_U21013_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T545_U21014_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T546_U21015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T547_U21016_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T548_U21020_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T549_U21103_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T550_U21049_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T551_U21047_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T552_U21048_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T553_U21041_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T556_U21043_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T555_U21042_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T554_U21040_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T557_U21129_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T558_U21031_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T559_U21035_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T560_U21036_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T561_U21037_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T562_U21050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T563_U21051_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T564_U21053_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T565_U21054_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T566_U21055_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T567_U21056_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T568_U21057_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T569_U21061_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T570_U21062_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T571_U21063_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T572_U21065_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T573_U21066_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T574_U21067_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T575_U21072_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T576_U21073_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T577_U21097_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T578_U21075_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T579_U21076_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T580_U21079_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T581_U21080_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T582_U21081_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T583_U21082_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T584_U21084_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T585_U21085_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T586_U21086_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T587_U21088_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T588_U21089_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T589_U21092_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T590_U21094_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T591_U21098_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T592_U21099_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T593_U21101_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T594_U21102_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T595_U21104_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T596_U21105_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T597_U21120_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T598_U21108_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T599_U21109_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T600_U21110_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T601_U21183_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T602_U21113_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T603_U21114_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T604_U21115_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T605_U21117_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T606_U21118_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T607_U21121_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T608_U21122_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T609_U21124_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T610_U21125_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T612_U21130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T613_U21131_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T614_U21133_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T615_U21134_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T616_U21135_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T617_U21136_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T618_U21138_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T619_U21141_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T620_U21142_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T621_U21144_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T622_U21145_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T623_U21146_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T624_U21147_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T625_U21148_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T626_U21152_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T627_U21157_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T628_U21159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T629_U21160_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T630_U21161_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T631_U21162_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T632_U21166_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T633_U21167_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T634_U21169_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T635_U21170_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T636_U21171_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T637_U21172_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T638_U21173_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T639_U21178_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T641_U21180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T642_U21184_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T643_U21185_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T644_U21187_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T645_U21188_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T646_U21189_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T647_U21190_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T648_U21191_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T649_U21192_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T650_U21193_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T651_U21197_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T652_U21198_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T653_U21199_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T654_U21201_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T655_U21202_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T656_U21203_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T657_U21204_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T658_U21205_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T659_U21209_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T660_U21210_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T661_U21211_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T662_U21212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T663_U21213_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T664_U21241_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T665_U21242_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T667_U21239_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T671_U21232_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T666_U21238_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T668_U21233_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T669_U21231_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T670_U21234_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T672_U21223_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T673_U21224_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T674_U21228_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T675_U21230_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T676_U21243_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T677_U21244_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T678_U21245_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T679_U21249_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T680_U21251_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T681_U21255_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T682_U21256_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T683_U21257_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T684_U21259_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T685_U21260_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T686_U21261_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T687_U21262_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T688_U21263_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T689_U21264_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T690_U21266_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T691_U21269_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T692_U21270_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T693_U21272_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T694_U21273_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T695_U21274_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T696_U21275_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T697_U21276_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T698_U21280_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T699_U21281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T700_U21282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T701_U21283_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T702_U21285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
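The queue dump above follows a regular two-level pattern ("Requests in queue <...> (N requests):" followed by per-handler counts). A minimal parsing sketch, not SAP tooling, that tallies requests per handler from a hypothetical excerpt in the same shape:

```python
import re
from collections import Counter

# Hypothetical excerpt shaped like the dispatcher queue dump above.
trace = """\
Requests in queue <T595_U21104_M0> (4 requests):
        - 3 requests for handler REQ_HANDLER_PLUGIN
        - 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T562_U21050_M0> (1 requests):
        - 1 requests for handler REQ_HANDLER_RFC
"""

# Matches the per-handler detail lines, capturing count and handler name.
handler_re = re.compile(r"- (\d+) requests for handler (\S+)")

totals = Counter()
for line in trace.splitlines():
    m = handler_re.search(line)
    if m:
        totals[m.group(2)] += int(m.group(1))

print(dict(totals))
# {'REQ_HANDLER_PLUGIN': 3, 'REQ_HANDLER_SESSION': 1, 'REQ_HANDLER_RFC': 1}
```

Summing the per-handler counts this way makes it easy to see, for the full dump, how much of the backlog is PLUGIN (HTTP) versus RFC traffic.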

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests
Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 05:18:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 187
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action|Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |10 |norm|T22_U19960_M0 |HTTP_NORM| | || |001|SM_EXTERN_WS| ||
| 1| |DIA |WP_KILL| |216|norm|T59_U19947_M0 |HTTP_NORM| | ||SAPMHTTP |001|SM_EXTERN_WS| ||
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10514|SAPMSSY1 |001|SM_EFWK || |
| 3|19909 |DIA |WP_RUN | | |high|T640_U21279_M0 |INTERNAL | | |9| |000|SAPSYS | ||
| 4|19804 |DIA |WP_RUN | | |high|T611_U21278_M0 |INTERNAL | | |9| |000|SAPSYS | ||
| 5| |DIA |WP_KILL| |10 |high|T109_U5012_M1 |GUI | | ||SBAL_DELETE |001|EXT_SCHAITAN| ||
| 6| |DIA |WP_KILL| |11 | | | | | || | | | ||
| 7| |DIA |WP_KILL| |10 |norm|T55_U19953_M0 |HTTP_NORM| | || |001|SM_EXTERN_WS| ||

Found 8 active workprocesses


Total number of workprocesses is 16
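The Workprocess Table above is a fixed set of '|'-separated fields, with the State column (WP_KILL, WP_HOLD, WP_RUN, ...) in the 4th field after the leading '|'. A small sketch, assuming rows shaped like the ones in this trace, that counts work processes by state:

```python
from collections import Counter

# Sample rows in the shape of the Workprocess Table above; the state is
# the 4th '|'-separated field after the leading '|'.
rows = [
    "| 0|      |DIA |WP_KILL| |10 |norm|T22_U19960_M0 |HTTP_NORM| | |",
    "| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |",
    "| 3|19909 |DIA |WP_RUN | | |high|T640_U21279_M0 |INTERNAL | | |",
]

# Tally the State column for each row.
states = Counter(row.split("|")[4].strip() for row in rows)
print(dict(states))
# {'WP_KILL': 1, 'WP_HOLD': 1, 'WP_RUN': 1}
```

For an overloaded dispatcher like this one, a quick state tally shows at a glance how many of the 16 work processes are stuck in WP_KILL versus actually running.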

Session Table Sun Sep 22 05:18:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info|Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| |
|norm|3 | | |
0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |
SAPMSSY1 |low | |
| | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| |
|norm|14 | | |
0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| |
|norm|7 | | |
0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| |
|norm|14 | | |
0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| |
|norm|6 | | |
0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| |
|norm|3 | | |
0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |
SAPMSSY1 |low |2 |
| | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| |
|norm|3 | | |
0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| |
|norm|3 | | |
0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| |
|norm|16 | | |
0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| |
|norm|7 | | |
0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| |
|norm|3 | | |
0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| |
|norm|3 | | |
0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| |
|norm|3 | | |
0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |
SAPMSSY1 |low | |
| | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| |
|norm|4 | | |
0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| |
|norm|3 | | |
0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |
SAPMSSY1 |norm|2 |
| | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| |
|norm|15 | | |
0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |
SAPMSSY1 |norm| |
| | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T71_U20451_M0 | | |10.54.36.35 |04:49:29| |
|norm|3 | | |
0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| |
|norm|3 | | |
0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| |
|norm|1 | | |
0|
|SYNC_RFC |T75_U20558_M0 | | |smprd02.niladv.org |04:53:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| |
|norm|4 | | |
0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| |
|norm|3 | | |
0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| |
|norm|1 | | |
0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| |
|norm|27 | | |
0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| |
|norm|1 | | |
0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| |
|norm|1 | | |
0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| |
|norm|1 | | |
0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |
SAPMSSY1 |norm|1 |
| | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| |
|norm|3 | | |
0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| |
|low |1 | | |
0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |
SBAL_DELETE |high|1 |
|SA38 | 12448|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| |
|norm|5 | | |
0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| |
|norm|14 | | |
0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| |
|norm|3 | | |
0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |
SAPMSSY1 |norm|1 |
| | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| |
|norm|6 | | |
0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| |
|norm|3 | | |
0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| |
|norm|1 | | |
0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| |
|norm|3 | | |
0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| |
|norm|4 | | |
0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| |
|norm|3 | | |
0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| |
|norm|1 | | |
0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| |
|norm|3 | | |
0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| |
|norm|1 | | |
0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| |
|norm|3 | | |
0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| |
|norm|7 | | |
0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| |
|norm|3 | | |
0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| |
|norm|1 | | |
0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| |
|norm|4 | | |
0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| |
|norm|4 | | |
0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| |
|norm|3 | | |
0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| |
|norm|3 | | |
0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| |
|norm|1 | | |
0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| |
|norm|3 | | | 0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| | |norm|20 | | | 0|
|HTTP_NORMAL |T191_U20619_M0 | | |10.54.36.29 |04:56:48| | |norm|3 | | | 0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| | |norm|1 | | | 0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| | |norm|4 | | | 0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| | |norm|15 | | | 0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| | |norm|20 | | | 0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| | |norm|4 | | | 0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| | |norm|4 | | | 0|
|HTTP_NORMAL |T208_U20991_M0 | | |10.54.36.11 |05:09:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| | |norm|6 | | | 0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| | |norm|6 | | | 0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| | |norm|3 | | | 0|
|HTTP_NORMAL |T213_U20687_M0 | | |10.54.36.11 |04:59:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T214_U20638_M0 | | |10.54.36.29 |04:57:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| | |norm|3 | | | 0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| | |norm|3 | | | 0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| | |norm|1 | | | 0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| | |norm|4 | | | 0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| | |norm|6 | | | 0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| | |norm|4 | | | 0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| | |norm|6 | | | 0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| | |norm|6 | | | 0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| | |norm|3 | | | 0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| | |norm|3 | | | 0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| | |norm|3 | | | 0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| | |norm|3 | | | 0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| | |norm|5 | | | 0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| | |norm|4 | | | 0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| | |norm|4 | | | 0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| | |norm|19 | | | 0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| | |norm|14 | | | 0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| | |norm|19 | | | 0|
|SYNC_RFC |T256_U20748_M0 | | |smprd02.niladv.org |05:01:50| | |norm|1 | | | 0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| | |norm|3 | | | 0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| | |norm|6 | | | 0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| | |norm|19 | | | 0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T267_U20801_M0 | | |10.54.36.35 |05:03:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| | |norm|3 | | | 0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T270_U20802_M0 | | |10.54.36.13 |05:03:30| | |norm|4 | | | 0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| | |norm|1 | | | 0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| | |norm|3 | | | 0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| | |norm|1 | | | 0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| | |norm|3 | | | 0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| | |norm|3 | | | 0|
|SYNC_RFC |T277_U20600_M0 | | |smprd02.niladv.org |04:55:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| | |norm|3 | | | 0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| | |norm|6 | | | 0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| | |norm|3 | | | 0|
|HTTP_NORMAL |T287_U20582_M0 | | |10.54.36.12 |04:55:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T288_U20578_M0 | | |10.54.36.29 |04:54:48| | |norm|12 | | | 0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| | |norm|3 | | | 0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| | |norm|4 | | | 0|
|SYNC_RFC |T293_U20809_M0 | | |smprd02.niladv.org |05:03:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T296_U20636_M0 | | |10.54.36.14 |04:57:37| | |norm|3 | | | 0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| | |norm|5 | | | 0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T300_U20907_M0 | | |10.50.47.10 |05:06:54| | |norm|3 | | | 0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| | |norm|3 | | | 0|
|HTTP_NORMAL |T305_U20580_M0 | | |10.50.47.10 |04:54:54| | |norm|3 | | | 0|
|HTTP_NORMAL |T306_U20581_M0 | | |10.54.36.34 |04:54:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T307_U20585_M0 | | |10.54.36.17 |04:55:05| | |norm|3 | | | 0|
|ASYNC_RFC |T308_U20586_M0 | | |10.54.36.10 |04:55:10| | |norm|1 | | | 0|
|HTTP_NORMAL |T309_U20587_M0 | | |10.50.47.13 |04:55:12| | |norm|4 | | | 0|
|HTTP_NORMAL |T310_U20592_M0 | | |10.54.36.13 |04:55:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T311_U20594_M0 | | |10.54.36.28 |04:55:33| | |norm|4 | | | 0|
|HTTP_NORMAL |T312_U20595_M0 | | |10.54.36.26 |04:55:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T313_U20596_M0 | | |10.54.36.29 |04:55:38| | |norm|14 | | | 0|
|SYNC_RFC |T314_U20597_M0 | | |smprd02.niladv.org |04:55:39| | |norm|1 | | | 0|
|HTTP_NORMAL |T315_U20602_M0 | | |10.54.36.19 |04:55:58| | |norm|18 | | | 0|
|HTTP_NORMAL |T316_U20603_M0 | | |10.54.36.15 |04:56:01| | |norm|5 | | | 0|
|HTTP_NORMAL |T317_U20605_M0 | | |10.54.36.30 |04:56:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T318_U20606_M0 | | |10.54.36.38 |04:56:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T319_U20613_M0 | | |10.54.36.11 |04:56:32| | |norm|3 | | | 0|
|SYNC_RFC |T320_U20615_M0 | | |smprd02.niladv.org |04:56:39| | |norm|1 | | | 0|
|HTTP_NORMAL |T321_U20616_M0 | | |10.54.36.41 |04:56:40| | |norm|3 | | | 0|
|SYNC_RFC |T322_U20620_M0 | | |smprd02.niladv.org |04:56:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T323_U20621_M0 | | |10.54.36.32 |04:56:50| | |norm|11 | | | 0|
|HTTP_NORMAL |T324_U20622_M0 | | |10.54.36.27 |04:56:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T325_U20624_M0 | | |10.54.36.33 |04:56:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T326_U20626_M0 | | |10.54.36.36 |04:57:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T327_U20627_M0 | | |10.54.36.25 |04:57:03| | |norm|3 | | | 0|
|SYNC_RFC |T328_U20630_M0 | | |smprd02.niladv.org |04:57:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T329_U20632_M0 | | |10.50.47.13 |04:57:13| | |norm|4 | | | 0|
|HTTP_NORMAL |T330_U20639_M0 | | |10.54.36.29 |04:57:48| | |norm|3 | | | 0|
|SYNC_RFC |T331_U20641_M0 | | |smprd02.niladv.org |04:57:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T332_U20642_M0 | | |10.50.47.10 |04:57:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T333_U20643_M0 | | |10.54.36.34 |04:57:58| | |norm|3 | | | 0|
|SYNC_RFC |T334_U20692_M0 | | |smprd02.niladv.org |04:59:50| | |norm|1 | | | 0|
|HTTP_NORMAL |T335_U20645_M0 | | |10.54.36.12 |04:58:02| | |norm|6 | | | 0|
|HTTP_NORMAL |T336_U20648_M0 | | |10.54.36.17 |04:58:05| | |norm|3 | | | 0|
|HTTP_NORMAL |T337_U20653_M0 | | |10.54.36.35 |04:58:27| | |norm|3 | | | 0|
|HTTP_NORMAL |T338_U20717_M0 | | |10.54.36.32 |05:00:50| | |norm|3 | | | 0|
|SYNC_RFC |T339_U20673_M0 | | |smprd02.niladv.org |04:58:53| | |norm|1 | | | 0|
|HTTP_NORMAL |T340_U20677_M0 | | |10.54.36.38 |04:59:02| | |norm|4 | | | 0|
|SYNC_RFC |T341_U20690_M0 | | |smprd02.niladv.org |04:59:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T342_U20674_M0 | | |10.54.36.19 |04:58:58| | |norm|11 | | | 0|
|HTTP_NORMAL |T343_U20691_M0 | | |10.54.36.29 |04:59:49| | |norm|9 | | | 0|
|HTTP_NORMAL |T344_U20676_M0 | | |10.54.36.15 |04:59:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T345_U20679_M0 | | |10.54.36.30 |04:59:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T346_U20990_M0 | | |10.54.36.13 |05:09:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T347_U20664_M0 | | |10.54.36.26 |04:58:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T348_U20723_M0 | | |10.54.36.12 |05:01:01| | |norm|4 | | | 0|
|SYNC_RFC |T349_U20669_M0 | | |smprd02.niladv.org |04:58:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T350_U20670_M0 | | |10.54.36.32 |04:58:50| | |norm|4 | | | 0|
|HTTP_NORMAL |T351_U20671_M0 | | |10.54.36.27 |04:58:50| | |norm|4 | | | 0|
|SYNC_RFC |T352_U20716_M0 | | |smprd02.niladv.org |05:00:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T353_U20694_M0 | | |10.54.36.33 |04:59:57| | |norm|3 | | | 0|
|HTTP_NORMAL |T354_U20697_M0 | | |10.54.36.25 |05:00:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T355_U20698_M0 | | |10.54.36.36 |05:00:04| | |norm|3 | | | 0|
|HTTP_NORMAL |T356_U20700_M0 | | |10.50.47.13 |05:00:12| | |norm|3 | | | 0|
|SYNC_RFC |T357_U20702_M0 | | |smprd02.niladv.org |05:00:14| | |norm|1 | | | 0|
|SYNC_RFC |T358_U20703_M0 | | |smprd02.niladv.org |05:00:16| | |norm|1 | | | 0|
|SYNC_RFC |T359_U20704_M0 | | |smprd02.niladv.org |05:00:18| | |norm|1 | | | 0|
|SYNC_RFC |T360_U20705_M0 | | |smprd02.niladv.org |05:00:20| | |norm|1 | | | 0|
|SYNC_RFC |T361_U20707_M0 | | |smprd02.niladv.org |05:00:22| | |norm|1 | | | 0|
|HTTP_NORMAL |T362_U20711_M0 | | |10.54.36.37 |05:00:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T363_U20712_M0 | | |10.54.36.14 |05:00:37| | |norm|3 | | | 0|
|HTTP_NORMAL |T364_U20713_M0 | | |10.54.36.29 |05:00:38| | |norm|13 | | | 0|
|SYNC_RFC |T365_U20714_M0 | | |smprd02.niladv.org |05:00:39| | |norm|1 | | | 0|
|HTTP_NORMAL |T366_U20718_M0 | | |10.54.36.27 |05:00:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T367_U20720_M0 | | |10.50.47.10 |05:00:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T368_U20721_M0 | | |10.54.36.40 |05:00:56| | |norm|3 | | | 0|
|HTTP_NORMAL |T369_U20722_M0 | | |10.54.36.34 |05:00:57| | |norm|6 | | | 0|
|HTTP_NORMAL |T370_U20724_M0 | | |10.54.36.15 |05:01:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T371_U20725_M0 | | |10.54.36.17 |05:01:02| | |norm|4 | | | 0|
|HTTP_NORMAL |T372_U20727_M0 | | |10.54.36.38 |05:01:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T373_U20738_M0 | | |10.54.36.35 |05:01:27| | |norm|3 | | | 0|
|HTTP_NORMAL |T374_U20729_M0 | | |10.54.36.30 |05:01:04| | |norm|3 | | | 0|
|SYNC_RFC |T375_U20731_M0 | | |smprd02.niladv.org |05:01:14| | |norm|1 | | | 0|
|SYNC_RFC |T376_U20732_M0 | | |smprd02.niladv.org |05:01:16| | |norm|1 | | | 0|
|SYNC_RFC |T377_U20733_M0 | | |smprd02.niladv.org |05:01:18| | |norm|1 | | | 0|
|SYNC_RFC |T378_U20734_M0 | | |smprd02.niladv.org |05:01:20| | |norm|1 | | | 0|
|SYNC_RFC |T379_U20736_M0 | | |smprd02.niladv.org |05:01:22| | |norm|1 | | | 0|
|HTTP_NORMAL |T380_U20739_M0 | | |10.54.36.13 |05:01:30| | |norm|3 | | | 0|
|HTTP_NORMAL |T381_U20740_M0 | | |10.54.36.26 |05:01:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T382_U20741_M0 | | |10.54.36.28 |05:01:32| | |norm|6 | | | 0|
|HTTP_NORMAL |T383_U20742_M0 | | |10.54.36.11 |05:01:32| | |norm|3 | | | 0|
|SYNC_RFC |T384_U20876_M0 | | |smprd02.niladv.org |05:05:50| | |norm|1 | | | 0|
|SYNC_RFC |T385_U20745_M0 | | |smprd02.niladv.org |05:01:39| | |norm|1 | | | 0|
|HTTP_NORMAL |T386_U20750_M0 | | |10.54.36.19 |05:01:57| | |norm|6 | | | 0|
|HTTP_NORMAL |T387_U20751_M0 | | |10.54.36.33 |05:01:58| | |norm|3 | | | 0|
|SYNC_RFC |T388_U20752_M0 | | |smprd02.niladv.org |05:02:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T389_U20754_M0 | | |10.54.36.25 |05:02:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T390_U20755_M0 | | |10.54.36.36 |05:02:04| | |norm|3 | | | 0|
|SYNC_RFC |T391_U20758_M0 | | |smprd02.niladv.org |05:02:11| | |norm|1 | | | 0|
|SYNC_RFC |T392_U20760_M0 | | |smprd02.niladv.org |05:02:14| | |norm|1 | | | 0|
|SYNC_RFC |T393_U20761_M0 | | |smprd02.niladv.org |05:02:16| | |norm|1 | | | 0|
|SYNC_RFC |T394_U20762_M0 | | |smprd02.niladv.org |05:02:18| | |norm|1 | | | 0|
|SYNC_RFC |T395_U20763_M0 | | |smprd02.niladv.org |05:02:20| | |norm|1 | | | 0|
|SYNC_RFC |T396_U20764_M0 | | |smprd02.niladv.org |05:02:22| | |norm|1 | | | 0|
|HTTP_NORMAL |T397_U20769_M0 | | |10.54.36.37 |05:02:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T398_U20877_M0 | | |10.54.36.27 |05:05:50| | |norm|4 | | | 0|
|HTTP_NORMAL |T399_U20774_M0 | | |10.54.36.29 |05:02:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T400_U20775_M0 | | |10.54.36.29 |05:02:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T401_U20776_M0 | | |10.54.36.29 |05:02:48| | |norm|3 | | | 0|
|SYNC_RFC |T402_U20777_M0 | | |smprd02.niladv.org |05:02:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T403_U20778_M0 | | |10.54.36.27 |05:02:50| | |norm|5 | | | 0|
|HTTP_NORMAL |T404_U20779_M0 | | |10.54.36.32 |05:02:50| | |norm|13 | | | 0|
|HTTP_NORMAL |T405_U20781_M0 | | |10.50.47.10 |05:02:53| | |norm|3 | | | 0|
|SYNC_RFC |T406_U20782_M0 | | |smprd02.niladv.org |05:02:54| | |norm|1 | | | 0|
|HTTP_NORMAL |T407_U20783_M0 | | |10.54.36.34 |05:02:58| | |norm|3 | | | 0|
|SYNC_RFC |T408_U20784_M0 | | |smprd02.niladv.org |05:03:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T409_U20785_M0 | | |10.54.36.12 |05:03:01| | |norm|6 | | | 0|
|HTTP_NORMAL |T410_U20786_M0 | | |10.54.36.17 |05:03:02| | |norm|3 | | | 0|
|SYNC_RFC |T411_U21254_M0 | | |smprd02.niladv.org |05:17:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T412_U20789_M0 | | |10.54.36.38 |05:03:04| | |norm|6 | | | 0|
|HTTP_NORMAL |T413_U20790_M0 | | |10.54.36.30 |05:03:04| | |norm|4 | | | 0|
|HTTP_NORMAL |T414_U20793_M0 | | |10.50.47.13 |05:03:11| | |norm|5 | | | 0|
|SYNC_RFC |T415_U20795_M0 | | |smprd02.niladv.org |05:03:14| | |norm|1 | | | 0|
|SYNC_RFC |T416_U20796_M0 | | |smprd02.niladv.org |05:03:16| | |norm|1 | | | 0|
|SYNC_RFC |T417_U20797_M0 | | |smprd02.niladv.org |05:03:18| | |norm|1 | | | 0|
|SYNC_RFC |T418_U20798_M0 | | |smprd02.niladv.org |05:03:20| | |norm|1 | | | 0|
|SYNC_RFC |T419_U20799_M0 | | |smprd02.niladv.org |05:03:22| | |norm|1 | | | 0|
|HTTP_NORMAL |T420_U20803_M0 | | |10.54.36.26 |05:03:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T421_U20804_M0 | | |10.54.36.28 |05:03:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T422_U21176_M0 | | |10.54.36.28 |05:15:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T423_U20807_M0 | | |10.54.36.41 |05:03:39| | |norm|3 | | | 0|
|SYNC_RFC |T424_U20810_M0 | | |smprd02.niladv.org |05:03:50| | |norm|1 | | | 0|
|SYNC_RFC |T425_U20812_M0 | | |smprd02.niladv.org |05:03:54| | |norm|1 | | | 0|
|HTTP_NORMAL |T426_U20813_M0 | | |10.54.36.19 |05:03:58| | |norm|10 | | | 0|
|HTTP_NORMAL |T427_U20814_M0 | | |10.54.36.33 |05:03:58| | |norm|3 | | | 0|
|SYNC_RFC |T428_U20844_M0 | | |smprd02.niladv.org |05:04:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T429_U20817_M0 | | |10.54.36.25 |05:04:03| | |norm|4 | | | 0|
|SYNC_RFC |T430_U20818_M0 | | |smprd02.niladv.org |05:04:04| | |norm|1 | | | 0|
|HTTP_NORMAL |T431_U20819_M0 | | |10.54.36.36 |05:04:04| | |norm|4 | | | 0|
|SYNC_RFC |T432_U20821_M0 | | |smprd02.niladv.org |05:04:10| | |norm|1 | | | 0|
|SYNC_RFC |T433_U20823_M0 | | |smprd02.niladv.org |05:04:14| | |norm|1 | | | 0|
|SYNC_RFC |T434_U20824_M0 | | |smprd02.niladv.org |05:04:16| | |norm|1 | | | 0|
|SYNC_RFC |T435_U20825_M0 | | |smprd02.niladv.org |05:04:18| | |norm|1 | | | 0|
|SYNC_RFC |T436_U20827_M0 | | |smprd02.niladv.org |05:04:20| | |norm|1 | | | 0|
|SYNC_RFC |T437_U20828_M0 | | |smprd02.niladv.org |05:04:22| | |norm|1 | | | 0|
|SYNC_RFC |T438_U20961_M0 | | |smprd02.niladv.org |05:08:46| | |norm|1 | | | 0|
|SYNC_RFC |T439_U20856_M0 | | |smprd02.niladv.org |05:05:10| | |norm|1 | | | 0|
|SYNC_RFC |T440_U20854_M0 | | |smprd02.niladv.org |05:05:04| | |norm|1 | | | 0|
|HTTP_NORMAL |T441_U20847_M0 | | |10.50.47.10 |05:04:54| | |norm|4 | | | 0|
|HTTP_NORMAL |T442_U20851_M0 | | |10.54.36.17 |05:05:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T443_U20850_M0 | | |10.54.36.12 |05:05:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T444_U20853_M0 | | |10.54.36.15 |05:05:03| | |norm|5 | | | 0|
|SYNC_RFC |T445_U20936_M0 | | |smprd02.niladv.org |05:07:45| | |norm|1 | | | 0|
|HTTP_NORMAL |T446_U20840_M0 | | |10.54.36.37 |05:04:33| | |norm|5 | | | 0|
|HTTP_NORMAL |T447_U20910_M0 | | |10.54.36.12 |05:07:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T448_U20845_M0 | | |10.54.36.29 |05:04:49| | |norm|3 | | | 0|
|SYNC_RFC |T449_U20859_M0 | | |smprd02.niladv.org |05:05:14| | |norm|1 | | | 0|
|SYNC_RFC |T450_U20860_M0 | | |smprd02.niladv.org |05:05:16| | |norm|1 | | | 0|
|SYNC_RFC |T451_U20861_M0 | | |smprd02.niladv.org |05:05:18| | |norm|1 | | | 0|
|SYNC_RFC |T452_U20863_M0 | | |smprd02.niladv.org |05:05:20| | |norm|1 | | | 0|
|SYNC_RFC |T453_U20864_M0 | | |smprd02.niladv.org |05:05:22| | |norm|1 | | | 0|
|HTTP_NORMAL |T454_U20867_M0 | | |10.54.36.35 |05:05:28| | |norm|3 | | | 0|
|HTTP_NORMAL |T455_U20868_M0 | | |10.54.36.13 |05:05:31| | |norm|6 | | | 0|
|HTTP_NORMAL |T456_U20869_M0 | | |10.54.36.26 |05:05:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T457_U20870_M0 | | |10.54.36.28 |05:05:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T458_U20872_M0 | | |10.54.36.29 |05:05:38| | |norm|12 | | | 0|
|HTTP_NORMAL |T459_U20873_M0 | | |10.54.36.41 |05:05:40| | |norm|3 | | | 0|
|SYNC_RFC |T460_U20874_M0 | | |smprd02.niladv.org |05:05:40| | |norm|1 | | | 0|
|HTTP_NORMAL |T461_U20878_M0 | | |10.54.36.32 |05:05:51| | |norm|19 | | | 0|
|HTTP_NORMAL |T462_U20880_M0 | | |10.54.36.19 |05:05:58| | |norm|3 | | | 0|
|HTTP_NORMAL |T463_U20881_M0 | | |10.54.36.33 |05:05:59| | |norm|3 | | | 0|
|HTTP_NORMAL |T464_U20911_M0 | | |10.54.36.17 |05:07:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T465_U20884_M0 | | |10.54.36.30 |05:06:03| | |norm|4 | | | 0|
|HTTP_NORMAL |T466_U20886_M0 | | |10.54.36.38 |05:06:03| | |norm|6 | | | 0|
|SYNC_RFC |T467_U20887_M0 | | |smprd02.niladv.org |05:06:05| | |norm|1 | | | 0|
|SYNC_RFC |T468_U20888_M0 | | |smprd02.niladv.org |05:06:05| | |norm|1 | | | 0|
|SYNC_RFC |T469_U20890_M0 | | |smprd02.niladv.org |05:06:14| | |norm|1 | | | 0|
|SYNC_RFC |T470_U20891_M0 | | |smprd02.niladv.org |05:06:15| | |norm|1 | | | 0|
|SYNC_RFC |T471_U20892_M0 | | |smprd02.niladv.org |05:06:17| | |norm|1 | | | 0|
|SYNC_RFC |T472_U20893_M0 | | |smprd02.niladv.org |05:06:18| | |norm|1 | | | 0|
|SYNC_RFC |T473_U20896_M0 | | |smprd02.niladv.org |05:06:20| | |norm|1 | | | 0|
|SYNC_RFC |T474_U20897_M0 | | |smprd02.niladv.org |05:06:22| | |norm|1 | | | 0|
|HTTP_NORMAL |T475_U20899_M0 | | |10.54.36.11 |05:06:32| | |norm|4 | | | 0|
|HTTP_NORMAL |T476_U20901_M0 | | |10.54.36.37 |05:06:33| | |norm|3 | | | 0|
|HTTP_NORMAL |T477_U20902_M0 | | |10.54.36.14 |05:06:37| | |norm|3 | | | 0|
|SYNC_RFC |T478_U20903_M0 | | |smprd02.niladv.org |05:06:40| | |norm|1 | | | 0|
|SYNC_RFC |T479_U20908_M0 | | |smprd02.niladv.org |05:06:58| | |norm|1 | | | 0|
|HTTP_NORMAL |T480_U20909_M0 | | |10.54.36.34 |05:06:59| | |norm|3 | | | 0|
|HTTP_NORMAL |T481_U20912_M0 | | |10.54.36.25 |05:07:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T482_U20914_M0 | | |10.54.36.36 |05:07:04| | |norm|4 | | | 0|
|HTTP_NORMAL |T483_U20929_M0 | | |10.54.36.35 |05:07:29| | |norm|3 | | | 0|
|SYNC_RFC |T484_U20916_M0 | | |smprd02.niladv.org |05:07:05| | |norm|1 | | | 0|
|SYNC_RFC |T485_U20917_M0 | | |smprd02.niladv.org |05:07:05| | |norm|1 | | | 0|
|SYNC_RFC |T486_U20918_M0 | | |smprd02.niladv.org |05:07:11| | |norm|1 | | | 0|
|HTTP_NORMAL |T487_U20919_M0 | | |10.50.47.13 |05:07:12| | |norm|3 | | | 0|
|SYNC_RFC |T488_U20921_M0 | | |smprd02.niladv.org |05:07:15| | |norm|1 | | | 0|
|SYNC_RFC |T489_U20922_M0 | | |smprd02.niladv.org |05:07:15| | |norm|1 | | | 0|
|SYNC_RFC |T490_U20923_M0 | | |smprd02.niladv.org |05:07:17| | |norm|1 | | | 0|
|SYNC_RFC |T491_U20924_M0 | | |smprd02.niladv.org |05:07:19| | |norm|1 | | | 0|
|SYNC_RFC |T492_U20926_M0 | | |smprd02.niladv.org |05:07:21| | |norm|1 | | | 0|
|SYNC_RFC |T493_U20927_M0 | | |smprd02.niladv.org |05:07:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T494_U20930_M0 | | |10.54.36.13 |05:07:31| | |norm|4 | | | 0|
|HTTP_NORMAL |T495_U20931_M0 | | |10.54.36.26 |05:07:32| | |norm|3 | | | 0|
|HTTP_NORMAL |T496_U20933_M0 | | |10.54.36.28 |05:07:34| | |norm|4 | | | 0|
|HTTP_NORMAL |T497_U20937_M0 | | |10.54.36.29 |05:07:48| | |norm|10 | | | 0|
|HTTP_NORMAL |T498_U20938_M0 | | |10.54.36.29 |05:07:48| | |norm|3 | | | 0|
|HTTP_NORMAL |T499_U20939_M0 | | |10.54.36.29 |05:07:49| | |norm|3 | | | 0|
|SYNC_RFC |T500_U20941_M0 | | |smprd02.niladv.org |05:07:59| | |norm|1 | | | 0|
|HTTP_NORMAL |T501_U20942_M0 | | |10.54.36.15 |05:08:01| | |norm|5 | | | 0|
|SYNC_RFC |T502_U20946_M0 | | |smprd02.niladv.org |05:08:06| | |norm|1 | | | 0|
|SYNC_RFC |T503_U20947_M0 | | |smprd02.niladv.org |05:08:06| | |norm|1 | | | 0|
|SYNC_RFC |T504_U20949_M0 | | |smprd02.niladv.org |05:08:15| | |norm|1 | | | 0|
|SYNC_RFC |T505_U20950_M0 | | |smprd02.niladv.org |05:08:17| | |norm|1 | | | 0|
|SYNC_RFC |T506_U20951_M0 | | |smprd02.niladv.org |05:08:19| | |norm|1 | | | 0|
|SYNC_RFC |T507_U20952_M0 | | |smprd02.niladv.org |05:08:21| | |norm|1 | | | 0|
|SYNC_RFC |T508_U20953_M0 | | |smprd02.niladv.org |05:08:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T509_U20958_M0 | | |10.54.36.41 |05:08:40| | |norm|2 | | | 0|
|HTTP_NORMAL |T510_U20962_M0 | | |10.54.36.32 |05:08:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T511_U20963_M0 | | |10.54.36.27 |05:08:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T512_U20965_M0 | | |10.54.36.33 |05:08:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T513_U20966_M0 | | |10.54.36.19 |05:08:58| | |norm|20 | | | 0|
|SYNC_RFC |T514_U20967_M0 | | |smprd02.niladv.org |05:09:00| | |norm|1 | | | 0|
|HTTP_NORMAL |T515_U20968_M0 | | |10.54.36.12 |05:09:02| | |norm|5 | | | 0|
|HTTP_NORMAL |T516_U20969_M0 | | |10.54.36.30 |05:09:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T517_U20971_M0 | | |10.54.36.38 |05:09:03| | |norm|3 | | | 0|
|HTTP_NORMAL |T518_U20974_M0 | | |10.54.36.17 |05:09:05| | |norm|2 | | | 0|
|SYNC_RFC |T519_U20975_M0 | | |smprd02.niladv.org |05:09:06| | |norm|1 | | | 0|
|SYNC_RFC |T520_U20976_M0 | | |smprd02.niladv.org |05:09:06| | |norm|1 | | | 0|
|SYNC_RFC |T521_U20977_M0 | | |smprd02.niladv.org |05:09:08| | |norm|1 | | | 0|
|SYNC_RFC |T522_U20997_M0 | | |smprd02.niladv.org |05:09:46| | |norm|1 | | | 0|
|HTTP_NORMAL |T523_U20998_M0 | | |10.54.36.29 |05:09:48| | |norm|5 | | | 0|
|HTTP_NORMAL |T524_U21071_M0 | | |10.50.47.10 |05:11:53| | |norm|2 | | | 0|
|SYNC_RFC |T525_U21069_M0 | | |smprd02.niladv.org |05:11:48| | |norm|1 | | | 0|
|HTTP_NORMAL |T526_U20982_M0 | | |10.50.47.13 |05:09:12| | |norm|3 | | | 0|
|SYNC_RFC |T527_U20984_M0 | | |smprd02.niladv.org |05:09:15| | |norm|1 | | | 0|
|SYNC_RFC |T528_U20985_M0 | | |smprd02.niladv.org |05:09:17| | |norm|1 | | | 0|
|SYNC_RFC |T529_U20986_M0 | | |smprd02.niladv.org |05:09:19| | |norm|1 | | | 0|
|SYNC_RFC |T530_U20987_M0 | | |smprd02.niladv.org |05:09:21| | |norm|1 | | | 0|
|SYNC_RFC |T531_U20988_M0 | | |smprd02.niladv.org |05:09:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T532_U20992_M0 | | |10.54.36.37 |05:09:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T533_U20994_M0 | | |10.54.36.26 |05:09:33| | |norm|2 | | | 0|
|SYNC_RFC |T534_U21096_M0 | | |smprd02.niladv.org |05:12:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T535_U21000_M0 | | |10.50.47.10 |05:09:53| | |norm|3 | | | 0|
|HTTP_NORMAL |T536_U21001_M0 | | |10.54.36.34 |05:09:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T537_U21002_M0 | | |10.54.36.25 |05:10:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T538_U21004_M0 | | |10.54.36.36 |05:10:04| | |norm|2 | | | 0|
|SYNC_RFC |T539_U21007_M0 | | |smprd02.niladv.org |05:10:06| | |norm|1 | | | 0|
|SYNC_RFC |T540_U21008_M0 | | |smprd02.niladv.org |05:10:08| | |norm|1 | | | 0|
|SYNC_RFC |T541_U21009_M0 | | |smprd02.niladv.org |05:10:08| | |norm|1 | | | 0|
|SYNC_RFC |T542_U21010_M0 | | |smprd02.niladv.org |05:10:09| | |norm|1 | | | 0|
|SYNC_RFC |T543_U21012_M0 | | |smprd02.niladv.org |05:10:15| | |norm|1 | | | 0|
|SYNC_RFC |T544_U21013_M0 | | |smprd02.niladv.org |05:10:17| | |norm|1 | | | 0|
|SYNC_RFC |T545_U21014_M0 | | |smprd02.niladv.org |05:10:19| | |norm|1 | | | 0|
|SYNC_RFC |T546_U21015_M0 | | |smprd02.niladv.org |05:10:21| | |norm|1 | | | 0|
|SYNC_RFC |T547_U21016_M0 | | |smprd02.niladv.org |05:10:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T548_U21020_M0 | | |10.54.36.35 |05:10:27| | |norm|2 | | | 0|
|SYNC_RFC |T549_U21103_M0 | | |smprd02.niladv.org |05:13:01| | |norm|1 | | | 0|
|SYNC_RFC |T550_U21049_M0 | | |smprd02.niladv.org |05:11:08| | |norm|1 | | | 0|
|HTTP_NORMAL |T551_U21047_M0 | | |10.54.36.38 |05:11:04| | |norm|2 | | | 0|
|HTTP_NORMAL |T552_U21048_M0 | | |10.54.36.17 |05:11:05| | |norm|2 | | | 0|
|HTTP_NORMAL |T553_U21041_M0 | | |10.54.36.15 |05:11:00| | |norm|2 | | | 0|
|HTTP_NORMAL |T554_U21040_M0 | | |10.54.36.33 |05:10:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T555_U21042_M0 | | |10.54.36.12 |05:11:02| | |norm|2 | | | 0|
|HTTP_NORMAL |T556_U21043_M0 | | |10.54.36.30 |05:11:03| | |norm|2 | | | 0|
|SYNC_RFC |T557_U21129_M0 | | |smprd02.niladv.org |05:13:49| | |norm|1 | | | 0|
|HTTP_NORMAL |T558_U21031_M0 | | |10.54.36.29 |05:10:38| | |norm|10 | | | 0|
|SYNC_RFC |T559_U21035_M0 | | |smprd02.niladv.org |05:10:50| | |norm|1 | | | 0|
|HTTP_NORMAL |T560_U21036_M0 | | |10.54.36.32 |05:10:50| | |norm|9 | | | 0|
|HTTP_NORMAL |T561_U21037_M0 | | |10.54.36.27 |05:10:50| | |norm|3 | | | 0|
|SYNC_RFC |T562_U21050_M0 | | |smprd02.niladv.org |05:11:09| | |norm|1 | | | 0|
|HTTP_NORMAL |T563_U21051_M0 | | |10.50.47.13 |05:11:12| | |norm|2 | | | 0|
|SYNC_RFC |T564_U21053_M0 | | |smprd02.niladv.org |05:11:15| | |norm|1 | | | 0|
|SYNC_RFC |T565_U21054_M0 | | |smprd02.niladv.org |05:11:17| | |norm|1 | | | 0|
|SYNC_RFC |T566_U21055_M0 | | |smprd02.niladv.org |05:11:19| | |norm|1 | | | 0|
|SYNC_RFC |T567_U21056_M0 | | |smprd02.niladv.org |05:11:21| | |norm|1 | | | 0|
|SYNC_RFC |T568_U21057_M0 | | |smprd02.niladv.org |05:11:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T569_U21061_M0 | | |10.54.36.13 |05:11:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T570_U21062_M0 | | |10.54.36.11 |05:11:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T571_U21063_M0 | | |10.54.36.37 |05:11:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T572_U21065_M0 | | |10.54.36.26 |05:11:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T573_U21066_M0 | | |10.54.36.14 |05:11:37| | |norm|2 | | | 0|
|HTTP_NORMAL |T574_U21067_M0 | | |10.54.36.41 |05:11:39| | |norm|2 | | | 0|
|HTTP_NORMAL |T575_U21072_M0 | | |10.54.36.19 |05:11:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T576_U21073_M0 | | |10.54.36.34 |05:11:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T577_U21097_M0 | | |10.54.36.29 |05:12:49| | |norm|2 | | | 0|
|SYNC_RFC |T578_U21075_M0 | | |smprd02.niladv.org |05:12:01| | |norm|1 | | | 0|
|HTTP_NORMAL |T579_U21076_M0 | | |10.54.36.25 |05:12:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T580_U21079_M0 | | |10.54.36.36 |05:12:05| | |norm|2 | | | 0|
|SYNC_RFC |T581_U21080_M0 | | |smprd02.niladv.org |05:12:08| | |norm|1 | | | 0|
|SYNC_RFC |T582_U21081_M0 | | |smprd02.niladv.org |05:12:10| | |norm|1 | | | 0|
|SYNC_RFC |T583_U21082_M0 | | |smprd02.niladv.org |05:12:11| | |norm|1 | | | 0|
|SYNC_RFC |T584_U21084_M0 | | |smprd02.niladv.org |05:12:15| | |norm|1 | | | 0|
|SYNC_RFC |T585_U21085_M0 | | |smprd02.niladv.org |05:12:17| | |norm|1 | | | 0|
|SYNC_RFC |T586_U21086_M0 | | |smprd02.niladv.org |05:12:19| | |norm|1 | | | 0|
|SYNC_RFC |T587_U21088_M0 | | |smprd02.niladv.org |05:12:21| | |norm|1 | | | 0|
|SYNC_RFC |T588_U21089_M0 | | |smprd02.niladv.org |05:12:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T589_U21092_M0 | | |10.54.36.35 |05:12:28| | |norm|2 | | | 0|
|HTTP_NORMAL |T590_U21094_M0 | | |10.54.36.28 |05:12:34| | |norm|2 | | | 0|
|HTTP_NORMAL |T591_U21098_M0 | | |10.54.36.29 |05:12:49| | |norm|2 | | | 0|
|HTTP_NORMAL |T592_U21099_M0 | | |10.54.36.29 |05:12:49| | |norm|2 | | | 0|
|SYNC_RFC |T593_U21101_M0 | | |smprd02.niladv.org |05:12:54| | |norm|1 | | | 0|
|HTTP_NORMAL |T594_U21102_M0 | | |10.54.36.33 |05:12:58| | |norm|2 | | | 0|
|HTTP_NORMAL |T595_U21104_M0 | | |10.54.36.15 |05:13:01| | |norm|4 | | | 0|
|HTTP_NORMAL |T596_U21105_M0 | | |10.54.36.12 |05:13:03| | |norm|5 | | | 0|
|HTTP_NORMAL |T597_U21120_M0 | | |10.54.36.11 |05:13:32| | |norm|2 | | | 0|
|HTTP_NORMAL |T598_U21108_M0 | | |10.54.36.30 |05:13:04| | |norm|2 | | | 0|
|HTTP_NORMAL |T599_U21109_M0 | | |10.54.36.17 |05:13:05| | |norm|2 | | | 0|
|SYNC_RFC |T600_U21110_M0 | | |smprd02.niladv.org |05:13:08| | |norm|1 | | | 0|
|SYNC_RFC |T601_U21183_M0 | | |smprd02.niladv.org |05:15:48| | |norm|1 | | | 0|
|SYNC_RFC |T602_U21113_M0 | | |smprd02.niladv.org |05:13:15| | |norm|1 | | | 0|
|SYNC_RFC |T603_U21114_M0 | | |smprd02.niladv.org |05:13:17| | |norm|1 | | | 0|
|SYNC_RFC |T604_U21115_M0 | | |smprd02.niladv.org |05:13:19| | |norm|1 | | | 0|
|SYNC_RFC |T605_U21117_M0 | | |smprd02.niladv.org |05:13:21| | |norm|1 | | | 0|
|SYNC_RFC |T606_U21118_M0 | | |smprd02.niladv.org |05:13:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T607_U21121_M0 | | |10.54.36.13 |05:13:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T608_U21122_M0 | | |10.54.36.37 |05:13:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T609_U21124_M0 | | |10.54.36.26 |05:13:33| | |norm|2 | | | 0|
|HTTP_NORMAL |T610_U21125_M0 | | |10.54.36.14 |05:13:37| | |norm|2 | | | 0|
|INTERNAL |T611_U21278_M0 |000|SAPSYS | |05:18:34|4 | |high| | | | 4200|
|HTTP_NORMAL |T612_U21130_M0 | | |10.54.36.32 |05:13:50| | |norm|3 | | | 0|
|HTTP_NORMAL |T613_U21131_M0 | | |10.54.36.27 |05:13:50| | |norm|3 | | | 0|
|SYNC_RFC |T614_U21133_M0 | | |smprd02.niladv.org |05:13:54| | |norm|1 | | | 0|
|SYNC_RFC |T615_U21134_M0 | | |smprd02.niladv.org |05:13:56| | |norm|1 | | | 0|
|HTTP_NORMAL |T616_U21135_M0 | | |10.54.36.19 |05:13:58| | |norm|9 | | | 0|
|HTTP_NORMAL |T617_U21136_M0 | | |10.54.36.34 |05:13:59| | |norm|5 | | | 0|
|HTTP_NORMAL |T618_U21138_M0 | | |10.54.36.38 |05:14:03| | |norm|4 | | | 0|
|SYNC_RFC |T619_U21141_M0 | | |smprd02.niladv.org |05:14:08| | |norm|1 | | | 0|
|SYNC_RFC |T620_U21142_M0 | | |smprd02.niladv.org |05:14:10| | |norm|1 | | | 0|
|SYNC_RFC |T621_U21144_M0 | | |smprd02.niladv.org |05:14:15| | |norm|1 | | | 0|
|SYNC_RFC |T622_U21145_M0 | | |smprd02.niladv.org |05:14:17| | |norm|1 | | | 0|
|SYNC_RFC |T623_U21146_M0 | | |smprd02.niladv.org |05:14:19| | |norm|1 | | | 0|
|SYNC_RFC |T624_U21147_M0 | | |smprd02.niladv.org |05:14:21| | |norm|1 | | | 0|
|SYNC_RFC |T625_U21148_M0 | | |smprd02.niladv.org |05:14:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T626_U21152_M0 | | |10.54.36.35 |05:14:28| | |norm|2 | | | 0|
|HTTP_NORMAL |T627_U21157_M0 | | |10.54.36.29 |05:14:50| | |norm|6 | | | 0|
|HTTP_NORMAL |T628_U21159_M0 | | |10.50.47.10 |05:14:53| | |norm|3 | | | 0|
|SYNC_RFC |T629_U21160_M0 | | |smprd02.niladv.org |05:14:56| | |norm|1 | | | 0|
|HTTP_NORMAL |T630_U21161_M0 | | |10.54.36.36 |05:15:03| | |norm|2 | | | 0|
|HTTP_NORMAL |T631_U21162_M0 | | |10.54.36.25 |05:15:03| | |norm|2 | | | 0|
|SYNC_RFC |T632_U21166_M0 | | |smprd02.niladv.org |05:15:08| | |norm|1 | | | 0|
|HTTP_NORMAL |T633_U21167_M0 | | |10.50.47.13 |05:15:12| | |norm|5 | | | 0|
|SYNC_RFC |T634_U21169_M0 | | |smprd02.niladv.org |05:15:15| | |norm|1 | | | 0|
|SYNC_RFC |T635_U21170_M0 | | |smprd02.niladv.org |05:15:17| | |norm|1 | | | 0|
|SYNC_RFC |T636_U21171_M0 | | |smprd02.niladv.org |05:15:19| | |norm|1 | | | 0|
|SYNC_RFC |T637_U21172_M0 | | |smprd02.niladv.org |05:15:21| | |norm|1 | | | 0|
|SYNC_RFC |T638_U21173_M0 | | |smprd02.niladv.org |05:15:23| | |norm|1 | | | 0|
|HTTP_NORMAL |T639_U21178_M0 | | |10.54.36.29 |05:15:39| | |norm|9 | | | 0|
|INTERNAL |T640_U21279_M0 |000|SAPSYS | |05:18:34|3 | |high| | | | 4200|
|SYNC_RFC |T641_U21180_M0 | | |smprd02.niladv.org |05:15:41| | |norm|1 | | | 0|
|HTTP_NORMAL |T642_U21184_M0 | | |10.54.36.27 |05:15:50| | |norm|2 | | | 0|
|HTTP_NORMAL |T643_U21185_M0 | | |10.54.36.32 |05:15:51| | |norm|14 | | | 0|
|SYNC_RFC |T644_U21187_M0 | | |smprd02.niladv.org |05:15:56| | |norm|1 | | | 0|
|HTTP_NORMAL |T645_U21188_M0 | | |10.54.36.33 |05:15:57| | |norm|2 | | | 0|
|HTTP_NORMAL |T646_U21189_M0 | | |10.54.36.34 |05:15:59| | |norm|2 | | | 0|
|HTTP_NORMAL |T647_U21190_M0 | | |10.54.36.15 |05:16:00| | |norm|4 | | | 0|
|HTTP_NORMAL |T648_U21191_M0 | | |10.54.36.12 |05:16:01| | |norm|3 | | | 0|
|HTTP_NORMAL |T649_U21192_M0 | | |10.54.36.17 |05:16:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T650_U21193_M0 | | |10.54.36.30 |05:16:02| | |norm|3 | | | 0|
|HTTP_NORMAL |T651_U21197_M0 | | |10.54.36.38 |05:16:04| | |norm|5 | | | 0|
|SYNC_RFC |T652_U21198_M0 | | |smprd02.niladv.org |05:16:08| |
|norm|1 | | |
0|
|SYNC_RFC |T653_U21199_M0 | | |smprd02.niladv.org |05:16:10| |
|norm|1 | | |
0|
|SYNC_RFC |T654_U21201_M0 | | |smprd02.niladv.org |05:16:15| |
|norm|1 | | |
0|
|SYNC_RFC |T655_U21202_M0 | | |smprd02.niladv.org |05:16:17| |
|norm|1 | | |
0|
|SYNC_RFC |T656_U21203_M0 | | |smprd02.niladv.org |05:16:19| |
|norm|1 | | |
0|
|SYNC_RFC |T657_U21204_M0 | | |smprd02.niladv.org |05:16:21| |
|norm|1 | | |
0|
|SYNC_RFC |T658_U21205_M0 | | |smprd02.niladv.org |05:16:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T659_U21209_M0 | | |10.54.36.35 |05:16:29| |
|norm|2 | | |
0|
|HTTP_NORMAL |T660_U21210_M0 | | |10.54.36.13 |05:16:30| |
|norm|5 | | |
0|
|HTTP_NORMAL |T661_U21211_M0 | | |10.54.36.37 |05:16:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T662_U21212_M0 | | |10.54.36.26 |05:16:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T663_U21213_M0 | | |10.54.36.11 |05:16:32| |
|norm|2 | | |
0|
|SYNC_RFC |T664_U21241_M0 | | |smprd02.niladv.org |05:17:15| |
|norm|1 | | |
0|
|SYNC_RFC |T665_U21242_M0 | | |smprd02.niladv.org |05:17:17| |
|norm|1 | | |
0|
|SYNC_RFC |T666_U21238_M0 | | |smprd02.niladv.org |05:17:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T667_U21239_M0 | | |10.50.47.13 |05:17:13| |
|norm|1 | | |
0|
|HTTP_NORMAL |T668_U21233_M0 | | |10.54.36.36 |05:17:03| |
|norm|2 | | |
0|
|SYNC_RFC |T669_U21231_M0 | | |smprd02.niladv.org |05:16:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T670_U21234_M0 | | |10.54.36.25 |05:17:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T671_U21232_M0 | | |10.54.36.19 |05:16:57| |
|norm|4 | | |
0|
|HTTP_NORMAL |T672_U21223_M0 | | |10.54.36.14 |05:16:35| |
|norm|3 | | |
0|
|SYNC_RFC |T673_U21224_M0 | | |smprd02.niladv.org |05:16:41| |
|norm|1 | | |
0|
|SYNC_RFC |T674_U21228_M0 | | |smprd02.niladv.org |05:16:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T675_U21230_M0 | | |10.50.47.10 |05:16:54| |
|norm|2 | | |
0|
|SYNC_RFC |T676_U21243_M0 | | |smprd02.niladv.org |05:17:19| |
|norm|1 | | |
0|
|SYNC_RFC |T677_U21244_M0 | | |smprd02.niladv.org |05:17:21| |
|norm|1 | | |
0|
|SYNC_RFC |T678_U21245_M0 | | |smprd02.niladv.org |05:17:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T679_U21249_M0 | | |10.54.36.28 |05:17:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T680_U21251_M0 | | |10.54.36.41 |05:17:40| |
|norm|1 | | |
0|
|HTTP_NORMAL |T681_U21255_M0 | | |10.54.36.29 |05:17:49| |
|norm|4 | | |
0|
|HTTP_NORMAL |T682_U21256_M0 | | |10.54.36.29 |05:17:49| |
|norm|2 | | |
0|
|HTTP_NORMAL |T683_U21257_M0 | | |10.54.36.29 |05:17:49| |
|norm|2 | | |
0|
|SYNC_RFC |T684_U21259_M0 | | |smprd02.niladv.org |05:17:54| |
|norm|1 | | |
0|
|SYNC_RFC |T685_U21260_M0 | | |smprd02.niladv.org |05:17:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T686_U21261_M0 | | |10.54.36.33 |05:17:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T687_U21262_M0 | | |10.54.36.12 |05:18:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T688_U21263_M0 | | |10.54.36.15 |05:18:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T689_U21264_M0 | | |10.54.36.17 |05:18:02| |
|norm|1 | | |
0|
|HTTP_NORMAL |T690_U21266_M0 | | |10.54.36.30 |05:18:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T691_U21269_M0 | | |10.54.36.38 |05:18:04| |
|norm|3 | | |
0|
|SYNC_RFC |T692_U21270_M0 | | |smprd02.niladv.org |05:18:10| |
|norm|1 | | |
0|
|SYNC_RFC |T693_U21272_M0 | | |smprd02.niladv.org |05:18:15| |
|norm|1 | | |
0|
|SYNC_RFC |T694_U21273_M0 | | |smprd02.niladv.org |05:18:17| |
|norm|1 | | |
0|
|SYNC_RFC |T695_U21274_M0 | | |smprd02.niladv.org |05:18:19| |
|norm|1 | | |
0|
|SYNC_RFC |T696_U21275_M0 | | |smprd02.niladv.org |05:18:21| |
|norm|1 | | |
0|
|SYNC_RFC |T697_U21276_M0 | | |smprd02.niladv.org |05:18:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T698_U21280_M0 | | |10.54.36.13 |05:18:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T699_U21281_M0 | | |10.54.36.37 |05:18:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T700_U21282_M0 | | |10.54.36.26 |05:18:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T701_U21283_M0 | | |10.54.36.11 |05:18:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T702_U21285_M0 | | |10.54.36.14 |05:18:35| |
|norm|1 | | |
0|

Found 703 logons with 703 sessions


Total ES (gross) memory of all sessions: 83 MB
Most ES (gross) memory allocated by T109_U5012_M1: 12 MB

Force ABAP stack dump of session T12_U20220_M0


Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 05:18:43:886 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T253_U20470_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0
Force ABAP stack dump of session T288_U20578_M0
Force ABAP stack dump of session T313_U20596_M0
Force ABAP stack dump of session T315_U20602_M0
Force ABAP stack dump of session T323_U20621_M0
Force ABAP stack dump of session T342_U20674_M0
Force ABAP stack dump of session T364_U20713_M0
Force ABAP stack dump of session T404_U20779_M0
Force ABAP stack dump of session T426_U20813_M0
Force ABAP stack dump of session T458_U20872_M0
Force ABAP stack dump of session T461_U20878_M0
Force ABAP stack dump of session T497_U20937_M0
Force ABAP stack dump of session T513_U20966_M0
Force ABAP stack dump of session T558_U21031_M0
Force ABAP stack dump of session T643_U21185_M0

RFC-Connection Table (241 entries) Sun Sep 22 05:18:43 2019


------------------------------------------------------------

|No  |Conv-Id |Fi-Key                          |Sess-Key        |State           |Type  |Act. req  |WP |Time                            |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |
CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 2|54277723|54277723SU21134_M0 |T615_U21134_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 4|53822060|53822060SU20926_M0 |T492_U20926_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 5|53882124|53882124SU20949_M0 |T504_U20949_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 6|54577699|54577699SU21273_M0 |T694_U21273_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 7|54205118|54205118SU21101_M0 |T593_U21101_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 10|54372533|54372533SU21171_M0 |T636_U21171_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 11|53813726|53813726SU20922_M0 |T489_U20922_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 12|53370125|53370125SU20716_M0 |T352_U20716_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 13|53891128|53891128SU20952_M0 |T507_U20952_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 16|53750985|53750985SU20896_M0 |T473_U20896_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 19|54366552|54366552SU21169_M0 |T634_U21169_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 20|53819053|53819053SU20924_M0 |T491_U20924_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 21|53677905|53677905SU20861_M0 |T451_U20861_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 23|53812054|53812054SU20921_M0 |T488_U20921_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 24|54369536|54369536SU21170_M0 |T635_U21170_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 25|53941255|53941255SU20976_M0 |T520_U20976_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 26|54305161|54305161SU20692_M0 |T334_U20692_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 27|54586777|54586777SU21276_M0 |T697_U21276_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 28|53871130|53871130SU20946_M0 |T502_U20946_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 29|54554387|54554387SU21260_M0 |T685_U21260_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 30|54448632|54448632SU21205_M0 |T658_U21205_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 31|53533790|53533790SU20795_M0 |T415_U20795_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 33|54303059|54303059SU20690_M0 |T341_U20690_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 34|53477705|53477705SU20764_M0 |T396_U20764_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 35|53807465|53807465SU20918_M0 |T486_U20918_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 37|53606863|53606863SU20824_M0 |T434_U20824_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 38|53461465|53461465SU20758_M0 |T391_U20758_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |
CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 41|54397087|54397087SU21180_M0 |T641_U21180_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 42|53536792|53536792SU20796_M0 |T416_U20796_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 43|53659094|53659094SU20854_M0 |T440_U20854_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 44|54184460|54184460SU20641_M0 |T331_U20641_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 45|53745004|53745004SU20892_M0 |T471_U20892_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 48|54172368|54172368SU21089_M0 |T588_U21089_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 50|53747978|53747978SU20893_M0 |T472_U20893_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 51|54517702|54517702SU21245_M0 |T678_U21245_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 53|53816065|53816065SU20923_M0 |T490_U20923_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |
CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 56|53666302|53666302SU20856_M0 |T439_U20856_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 60|53799525|53799525SU20916_M0 |T484_U20916_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 61|54583761|54583761SU21275_M0 |T696_U21275_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 62|54199046|54199046SU21096_M0 |T534_U21096_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 64|54375579|54375579SU21172_M0 |T637_U21172_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 66|54580753|54580753SU21274_M0 |T695_U21274_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 67|54240969|54240969SU20669_M0 |T349_U20669_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 68|53474712|53474712SU20763_M0 |T395_U20763_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 69|53609831|53609831SU20825_M0 |T435_U20825_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 70|54213601|54213601SU21103_M0 |T549_U21103_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 71|53603861|53603861SU20823_M0 |T433_U20823_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 72|53504970|53504970SU20777_M0 |T402_U20777_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 75|54293525|54293525SU21142_M0 |T620_U21142_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 76|53740983|53740983SU20890_M0 |T469_U20890_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 79|54378556|54378556SU21173_M0 |T638_U21173_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 80|54405977|54405977SU21183_M0 |T601_U21183_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 81|53539769|53539769SU20797_M0 |T417_U20797_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 82|53542786|53542786SU20798_M0 |T418_U20798_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 83|54031268|54031268SU21015_M0 |T546_U21015_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 84|54055969|54055969SU20600_M0 |T277_U20600_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 85|53753961|53753961SU20897_M0 |T474_U20897_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 86|54508676|54508676SU21242_M0 |T665_U21242_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 87|54290760|54290760SU21141_M0 |T619_U21141_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |
SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 89|54274191|54274191SU21133_M0 |T614_U21133_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 90|53335572|53335572SU20704_M0 |T359_U20704_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 91|54574766|54574766SU21272_M0 |T693_U21272_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 92|53952146|53952146SU20984_M0 |T527_U20984_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 93|53933395|53933395SU20967_M0 |T514_U20967_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 95|53964177|53964177SU20988_M0 |T531_U20988_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 96|53683884|53683884SU20864_M0 |T453_U20864_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 98|32252616|32252616SU20586_M0 |T308_U20586_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 102|54568649|54568649SU21270_M0 |T692_U21270_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 103|53955198|53955198SU20985_M0 |T528_U20985_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 104|53961202|53961202SU20987_M0 |T530_U20987_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 106|53713330|53713330SU20876_M0 |T384_U20876_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 107|54311482|54311482SU21148_M0 |T625_U21148_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 108|54009269|54009269SU21007_M0 |T539_U21007_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 109|53791976|53791976SU20908_M0 |T479_U20908_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 112|54241442|54241442SU21118_M0 |T606_U21118_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 113|53825033|53825033SU20927_M0 |T493_U20927_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 115|53872183|53872183SU20947_M0 |T503_U20947_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 116|54544123|54544123SU21254_M0 |T411_U21254_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 118|53918034|53918034SU20961_M0 |T438_U20961_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 119|54302460|54302460SU21145_M0 |T622_U21145_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 121|53450513|53450513SU20752_M0 |T388_U20752_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 123|54511679|54511679SU21243_M0 |T676_U21243_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 124|54169405|54169405SU21088_M0 |T587_U21088_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 125|54308505|54308505SU21147_M0 |T624_U21147_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 126|54246530|54246530SU20673_M0 |T339_U20673_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 129|54268124|54268124SU21129_M0 |T557_U21129_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 130|53730608|53730608SU20888_M0 |T468_U20888_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 131|53863051|53863051SU20941_M0 |T500_U20941_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 132|54505692|54505692SU21241_M0 |T664_U21241_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 133|54022221|54022221SU21012_M0 |T543_U21012_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 134|54358834|54358834SU21166_M0 |T632_U21166_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 135|53772320|53772320SU20903_M0 |T478_U20903_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 136|54013979|54013979SU21009_M0 |T541_U21009_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 137|54345802|54345802SU21160_M0 |T629_U21160_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 140|54235438|54235438SU21115_M0 |T604_U21115_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 142|53888093|53888093SU20951_M0 |T506_U20951_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 143|53958159|53958159SU20986_M0 |T529_U20986_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |
SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 145|53615810|53615810SU20828_M0 |T437_U20828_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 146|54238481|54238481SU21117_M0 |T605_U21117_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 147|54436627|54436627SU21201_M0 |T654_U21201_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 149|53338570|53338570SU20705_M0 |T360_U20705_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 152|54221690|54221690SU21110_M0 |T600_U21110_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 153|53674932|53674932SU20860_M0 |T450_U20860_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 154|54028234|54028234SU21014_M0 |T545_U21014_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 157|54093287|54093287SU21054_M0 |T565_U21054_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 158|53573044|53573044SU20809_M0 |T293_U20809_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 159|54514686|54514686SU21244_M0 |T677_U21244_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 160|53405636|53405636SU20734_M0 |T378_U20734_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 162|54160337|54160337SU21084_M0 |T584_U21084_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 163|54083666|54083666SU21050_M0 |T562_U21050_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 164|54229412|54229412SU21113_M0 |T602_U21113_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 165|53591025|53591025SU20818_M0 |T430_U20818_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 166|53940202|53940202SU20975_M0 |T519_U20975_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 167|54430592|54430592SU21199_M0 |T653_U21199_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 169|53671936|53671936SU20859_M0 |T449_U20859_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 171|53575295|53575295SU20810_M0 |T424_U20810_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 173|54500465|54500465SU21238_M0 |T666_U21238_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 174|54015616|54015616SU21010_M0 |T542_U21010_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 175|53402644|53402644SU20733_M0 |T377_U20733_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 176|54333571|54333571SU20703_M0 |T358_U20703_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 177|54330571|54330571SU20702_M0 |T357_U20702_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 178|53359275|53359275SU20714_M0 |T365_U20714_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 180|53426328|53426328SU20745_M0 |T385_U20745_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 181|54427906|54427906SU21198_M0 |T652_U21198_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 182|54081559|54081559SU21049_M0 |T550_U21049_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 183|54166366|54166366SU21086_M0 |T586_U21086_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 184|54025224|54025224SU21013_M0 |T544_U21013_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 185|54150626|54150626SU21080_M0 |T581_U21080_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 186|54128970|54128970SU21069_M0 |T525_U21069_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 187|53341583|53341583SU20707_M0 |T361_U20707_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 190|53894111|53894111SU20953_M0 |T508_U20953_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 191|54485316|54485316SU21231_M0 |T669_U21231_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 192|54232387|54232387SU21114_M0 |T603_U21114_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 194|53944437|53944437SU20977_M0 |T521_U20977_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 195|53702262|53702262SU20874_M0 |T460_U20874_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 196|54090292|54090292SU21053_M0 |T564_U21053_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 197|54550695|54550695SU21259_M0 |T684_U21259_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 199|53643123|53643123SU20844_M0 |T428_U20844_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 200|53612837|53612837SU20827_M0 |T436_U20827_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 201|53399648|53399648SU20732_M0 |T376_U20732_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 202|54305468|54305468SU21146_M0 |T623_U21146_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 204|53438225|53438225SU20748_M0 |T256_U20748_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 206|54442606|54442606SU21203_M0 |T656_U21203_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 207|54096290|54096290SU21055_M0 |T566_U21055_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 208|54034234|54034234SU21016_M0 |T547_U21016_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 209|53580137|53580137SU20812_M0 |T425_U20812_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 212|53519586|53519586SU20784_M0 |T408_U20784_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 213|53545747|53545747SU20799_M0 |T419_U20799_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 214|54476047|54476047SU21228_M0 |T674_U21228_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 216|54141465|54141465SU20630_M0 |T328_U20630_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 220|53885142|53885142SU20950_M0 |T505_U20950_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 221|54118046|54118046SU20620_M0 |T322_U20620_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 225|53848955|53848955SU20936_M0 |T445_U20936_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 227|54012478|54012478SU21008_M0 |T540_U21008_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 231|54439604|54439604SU21202_M0 |T655_U21202_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 232|54299485|54299485SU21144_M0 |T621_U21144_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 234|54107318|54107318SU20615_M0 |T320_U20615_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 235|54099331|54099331SU21056_M0 |T567_U21056_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 236|53598228|53598228SU20821_M0 |T432_U20821_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 238|53988107|53988107SU20997_M0 |T522_U20997_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 239|53468725|53468725SU20761_M0 |T393_U20761_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 242|53742654|53742654SU20891_M0 |T470_U20891_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 244|54142537|54142537SU21075_M0 |T578_U21075_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 248|54414846|54414846SU21187_M0 |T644_U21187_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 250|53800681|53800681SU20917_M0 |T485_U20917_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 251|54155465|54155465SU21082_M0 |T583_U21082_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 253|53408645|53408645SU20736_M0 |T379_U20736_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 254|53680911|53680911SU20863_M0 |T452_U20863_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 256|54445653|54445653SU21204_M0 |T657_U21204_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 257|54163344|54163344SU21085_M0 |T585_U21085_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 259|53465735|53465735SU20760_M0 |T392_U20760_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 261|54467123|54467123SU21224_M0 |T673_U21224_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 263|54045247|54045247SU20597_M0 |T314_U20597_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 265|53471695|53471695SU20762_M0 |T394_U20762_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 267|54153466|54153466SU21081_M0 |T582_U21081_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 271|53396676|53396676SU20731_M0 |T375_U20731_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 274|54102300|54102300SU21057_M0 |T568_U21057_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 277|54062174|54062174SU21035_M0 |T559_U21035_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 282|53511057|53511057SU20782_M0 |T406_U20782_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 284|53729452|53729452SU20887_M0 |T467_U20887_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 241 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
2 INVALID -1
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
376 INVALID -1
377 INVALID -1
378 INVALID -1
379 INVALID -1
380 INVALID -1
381 INVALID -1
382 INVALID -1
383 INVALID -1
384 INVALID -1
385 INVALID -1
386 INVALID -1
387 INVALID -1
388 INVALID -1
389 INVALID -1
390 INVALID -1
391 INVALID -1
392 INVALID -1
393 INVALID -1
394 INVALID -1
395 INVALID -1
396 INVALID -1
397 INVALID -1
398 INVALID -1
399 INVALID -1
400 INVALID -1
401 INVALID -1
402 INVALID -1
403 INVALID -1
404 INVALID -1
405 INVALID -1
406 INVALID -1
407 INVALID -1
408 INVALID -1
409 INVALID -1
410 INVALID -1
411 INVALID -1
412 INVALID -1
413 INVALID -1
414 INVALID -1
415 INVALID -1
416 INVALID -1
417 INVALID -1
418 INVALID -1
419 INVALID -1
420 INVALID -1
421 INVALID -1
422 INVALID -1
423 INVALID -1
424 INVALID -1
425 INVALID -1
426 INVALID -1
427 INVALID -1
428 INVALID -1
429 INVALID -1
... skip next entries
100 ca_blk slots of 6000 in use, 98 currently unowned (in request queues)

MPI Info Sun Sep 22 05:18:43 2019


------------------------------------------------------------
Current pipes in use: 211
Current / maximal blocks in use: 233 / 1884

Periodic Tasks Sun Sep 22 05:18:43 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5718| 77| | |
| 1|DDLOG | 5718| 77| | |
| 2|BTCSCHED | 11433| 21| | |
| 3|RESTART_ALL | 2287| 73| | |
| 4|ENVCHECK | 34315| 20| | |
| 5|AUTOABAP | 2287| 73| | |
| 6|BGRFC_WATCHDOG | 2288| 73| | |
| 7|AUTOTH | 317| 21| | |
| 8|AUTOCCMS | 11433| 21| | |
| 9|AUTOSECURITY | 11432| 21| | |
| 10|LOAD_CALCULATION | 685353| 1| | |
| 11|SPOOLALRM | 11438| 21| | |

Found 12 periodic tasks

********** SERVER SNAPSHOT 187 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:18:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:18:44:476 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:18:59:363 2019


DpHdlSoftCancel: cancel request for T675_U21230_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T671_U21232_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:19:03:858 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:19:04:161 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:19:04:368 2019


DpHdlSoftCancel: cancel request for T670_U21234_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T668_U21233_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:19:04:495 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:19:18:550 2019


DpHdlSoftCancel: cancel request for T667_U21239_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:19:23:859 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:19:24:513 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:19:34:559 2019


DpHdlSoftCancel: cancel request for T679_U21249_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:19:43:859 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 32559

Sun Sep 22 05:19:44:565 2019


DpHdlSoftCancel: cancel request for T680_U21251_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:19:54:813 2019


DpHdlSoftCancel: cancel request for T681_U21255_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
Sun Sep 22 05:19:59:818 2019
DpHdlSoftCancel: cancel request for T686_U21261_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:20:03:860 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-420
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-421
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-422
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-423
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-424

Sun Sep 22 05:20:04:162 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:20:04:561 2019


*** ERROR => DpHdlDeadWp: W0 (pid 420) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=420) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 420)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 421) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=421) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 421)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 422) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=422) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 422)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 423) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=423) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 423)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 424) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=424) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 424)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
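The status=65280 / exit code 255 pairing in the errors above is ordinary POSIX wait-status encoding (65280 = 0xFF00): a zero low byte means the child exited normally, and bits 8-15 carry the exit code. A minimal sketch of that decoding (a hypothetical helper, not part of the SAP kernel):

```python
def decode_wait_status(status: int):
    """Split a POSIX wait() status word into (exited_normally, exit_code)."""
    exited_normally = (status & 0xFF) == 0   # low byte 0 => normal termination
    exit_code = (status >> 8) & 0xFF         # bits 8-15 carry the exit code
    return exited_normally, exit_code

# status=65280 as logged for the dying work processes above:
print(decode_wait_status(65280))  # (True, 255)
```

This matches the trace line by line: DpHdlDeadWp reports the raw status word, DpTraceWpStatus the decoded exit code 255.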

Sun Sep 22 05:20:04:823 2019


DpHdlSoftCancel: cancel request for T690_U21266_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T688_U21263_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T689_U21264_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T687_U21262_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:20:09:828 2019


DpHdlSoftCancel: cancel request for T691_U21269_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:20:22:847 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:20:23:862 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 32559 terminated

Sun Sep 22 05:20:34:853 2019


DpHdlSoftCancel: cancel request for T698_U21280_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T699_U21281_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T700_U21282_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T701_U21283_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:20:39:857 2019


DpHdlSoftCancel: cancel request for T702_U21285_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T639_U21178_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:20:43:862 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:20:44:867 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:20:54:872 2019


DpHdlSoftCancel: cancel request for T705_U21292_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T704_U21290_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T703_U21289_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:20:59:877 2019


DpHdlSoftCancel: cancel request for T708_U21295_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T709_U21296_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:21:03:863 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:21:04:162 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:21:04:618 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:21:04:880 2019


DpHdlSoftCancel: cancel request for T710_U21297_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:21:18:709 2019


DpHdlSoftCancel: cancel request for T711_U21301_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:21:23:863 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:21:24:635 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:21:29:716 2019


DpHdlSoftCancel: cancel request for T640_U21310_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:21:43:864 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:22:03:864 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpCheckTerminals: NiCheck2(rc=-23: NIENO_ANSWER) failed for T109_U5012 (60 secs)-> disconnecting [dpTerminal.c 1467]
|GUI |T109_U5012_M1 |001|EXT_SCHAITAN|SST-LAP-HP0055 |04:31:07|5 |SBAL_DELETE |high|1 | |SA38 |

Sun Sep 22 05:22:04:163 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:22:04:933 2019


DpHdlSoftCancel: cancel request for T718_U21317_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T719_U21318_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:09:939 2019


DpHdlSoftCancel: cancel request for T720_U21322_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:20:776 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:22:23:864 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:22:34:959 2019


DpHdlSoftCancel: cancel request for T727_U21333_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T728_U21334_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T729_U21335_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T731_U21337_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T730_U21336_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:39:959 2019


DpHdlSoftCancel: cancel request for T732_U21339_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:40:796 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:22:43:865 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:22:44:960 2019


DpHdlSoftCancel: cancel request for T734_U21341_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:48:969 2019


DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.908261 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.908171 / 0.000000

Sun Sep 22 05:22:49:965 2019


DpHdlSoftCancel: cancel request for T736_U21346_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:52:973 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 05:22:54:970 2019


DpHdlSoftCancel: cancel request for T738_U21348_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T739_U21349_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:22:59:975 2019


DpHdlSoftCancel: cancel request for T741_U21352_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T742_U21353_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T740_U21351_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:23:03:865 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:23:04:164 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:23:04:981 2019


DpHdlSoftCancel: cancel request for T784_U21417_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T745_U21356_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T743_U21354_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T744_U21355_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T746_U21358_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:23:14:990 2019


DpHdlSoftCancel: cancel request for T747_U21361_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:23:20:828 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:23:23:866 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:23:30:001 2019


DpHdlSoftCancel: cancel request for T526_U21370_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:23:40:005 2019


DpHdlSoftCancel: cancel request for T782_U21415_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:23:40:844 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:23:43:866 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:23:59:577 2019


DpHdlSoftCancel: cancel request for T754_U21377_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T611_U21375_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:24:03:867 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:24:04:165 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:24:05:866 2019


DpHdlSoftCancel: cancel request for T759_U21384_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T758_U21382_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T757_U21380_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:24:20:880 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:24:23:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:19:43 2019, skip new snapshot

Sun Sep 22 05:24:34:889 2019


DpHdlSoftCancel: cancel request for T779_U21409_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T776_U21405_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T777_U21406_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T775_U21404_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T778_U21407_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:24:39:894 2019


DpHdlSoftCancel: cancel request for T780_U21410_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:24:40:898 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:24:43:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 188 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:24:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 05:24:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 2
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 1
Running requests[RQ_Q_PRIO_LOW] = 1
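The scheduler settings above follow a simple `name = value` pattern, with `infinite` as a sentinel, `true`/`false` booleans, and durations suffixed with `sec`. A minimal parsing sketch for pulling these into a structure for comparison across snapshots (the helper itself is illustrative, not an SAP tool; only the key names are taken from the dump above):

```python
import math
import re

# Matches lines like "maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec"
SETTING = re.compile(r"^\s*([\w.\[\]]+)\s*=\s*(.+?)\s*$")

def parse_value(raw):
    """Map dump values to Python: 'infinite' -> inf, booleans, '600 sec' -> 600."""
    raw = raw.strip()
    if raw == "infinite":
        return math.inf
    if raw in ("true", "false"):
        return raw == "true"
    m = re.match(r"^(\d+)(?:\s*sec)?$", raw)
    return int(m.group(1)) if m else raw

def parse_scheduler_info(lines):
    """Collect all 'name = value' settings from a scheduler-info block."""
    out = {}
    for line in lines:
        m = SETTING.match(line)
        if m:
            out[m.group(1)] = parse_value(m.group(2))
    return out

sample = [
    "maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec",
    "maxRuntime[RQ_Q_PRIO_LOW] = infinite",
    "preemptionInfo.isActive = true",
]
print(parse_scheduler_info(sample))
```

Comparing the parsed dictionaries of two consecutive snapshots makes drifting values (e.g. ticket counts) easy to spot.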

Queue Statistics Sun Sep 22 05:24:43 2019


------------------------------------------------------------

Number of lost wakeup datagrams: 4

Max. number of queue elements : 14000

DIA : 2431 (peak 2433, writeCount 24106369, readCount 24103938)
UPD : 0 (peak 31, writeCount 4962, readCount 4962)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125418, readCount 2125418)
SPO : 0 (peak 2, writeCount 25129, readCount 25129)
UP2 : 0 (peak 1, writeCount 2348, readCount 2348)
DISP: 0 (peak 67, writeCount 890378, readCount 890378)
GW : 0 (peak 49, writeCount 22411024, readCount 22411024)
ICM : 1 (peak 186, writeCount 391125, readCount 391124)
LWP : 6 (peak 16, writeCount 38304, readCount 38298)
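The queue rows above are internally consistent: the current depth equals writeCount minus readCount (e.g. DIA: 24106369 - 24103938 = 2431, matching the reported backlog). A small sketch that checks this invariant when scanning such tables (the regex is inferred from the line format above and is illustrative, not part of any SAP utility):

```python
import re

# Matches rows like: "DIA : 2431 (peak 2433, writeCount 24106369, readCount 24103938)"
PATTERN = re.compile(
    r"^(?P<name>\w+)\s*:\s*(?P<cur>\d+)\s+\(peak\s+(?P<peak>\d+),"
    r"\s+writeCount\s+(?P<wr>\d+),\s+readCount\s+(?P<rd>\d+)\)"
)

def parse_queue_stats(lines):
    """Return {queue: (current, peak, backlog)}; flag rows where current != backlog."""
    stats = {}
    for line in lines:
        m = PATTERN.match(line.strip())
        if not m:
            continue
        cur, wr, rd = int(m["cur"]), int(m["wr"]), int(m["rd"])
        backlog = wr - rd
        stats[m["name"]] = (cur, int(m["peak"]), backlog)
        if cur != backlog:
            print(f"inconsistent row: {m['name']} current={cur} backlog={backlog}")
    return stats

sample = [
    "DIA : 2431 (peak 2433, writeCount 24106369, readCount 24103938)",
    "BTC : 0 (peak 65, writeCount 2125418, readCount 2125418)",
]
print(parse_queue_stats(sample))
```

A DIA backlog near the 14000 element limit, combined with the dead W0/W1/W5/W6/W7 warnings above, is the kind of pattern this check surfaces quickly.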

Session queue dump (high priority, 0 elements, peak 39):


Session queue dump (normal priority, 798 elements, peak 798):
-1 <- 657 < T626_U21152_M0> -> 809
657 <- 809 < T778_U21407_M0> -> 806
809 <- 806 < T775_U21404_M0> -> 808
806 <- 808 < T777_U21406_M0> -> 807
808 <- 807 < T776_U21405_M0> -> 810
807 <- 810 < T779_U21409_M0> -> 649
810 <- 649 < T618_U21138_M0> -> 648
649 <- 648 < T617_U21136_M0> -> 788
648 <- 788 < T757_U21380_M0> -> 789
788 <- 789 < T758_U21382_M0> -> 790
789 <- 790 < T759_U21384_M0> -> 644
790 <- 644 < T613_U21131_M0> -> 643
644 <- 643 < T612_U21130_M0> -> 647
643 <- 647 < T616_U21135_M0> -> 642
647 <- 642 < T611_U21375_M0> -> 785
642 <- 785 < T754_U21377_M0> -> 641
785 <- 641 < T610_U21125_M0> -> 813
641 <- 813 < T782_U21415_M0> -> 639
813 <- 639 < T608_U21122_M0> -> 638
639 <- 638 < T607_U21121_M0> -> 628
638 <- 628 < T597_U21120_M0> -> 557
628 <- 557 < T526_U21370_M0> -> 630
557 <- 630 < T599_U21109_M0> -> 629
630 <- 629 < T598_U21108_M0> -> 778
629 <- 778 < T747_U21361_M0> -> 626
778 <- 626 < T595_U21104_M0> -> 627
626 <- 627 < T596_U21105_M0> -> 625
627 <- 625 < T594_U21102_M0> -> 777
625 <- 777 < T746_U21358_M0> -> 775
777 <- 775 < T744_U21355_M0> -> 622
775 <- 622 < T591_U21098_M0> -> 608
622 <- 608 < T577_U21097_M0> -> 623
608 <- 623 < T592_U21099_M0> -> 774
623 <- 774 < T743_U21354_M0> -> 776
774 <- 776 < T745_U21356_M0> -> 815
776 <- 815 < T784_U21417_M0> -> 771
815 <- 771 < T740_U21351_M0> -> 773
771 <- 773 < T742_U21353_M0> -> 772
773 <- 772 < T741_U21352_M0> -> 770
772 <- 770 < T739_U21349_M0> -> 769
770 <- 769 < T738_U21348_M0> -> 767
769 <- 767 < T736_U21346_M0> -> 621
767 <- 621 < T590_U21094_M0> -> 765
621 <- 765 < T734_U21341_M0> -> 763
765 <- 763 < T732_U21339_M0> -> 620
763 <- 620 < T589_U21092_M0> -> 761
620 <- 761 < T730_U21336_M0> -> 762
761 <- 762 < T731_U21337_M0> -> 760
762 <- 760 < T729_U21335_M0> -> 759
760 <- 759 < T728_U21334_M0> -> 758
759 <- 758 < T727_U21333_M0> -> 611
758 <- 611 < T580_U21079_M0> -> 610
611 <- 610 < T579_U21076_M0> -> 751
610 <- 751 < T720_U21322_M0> -> 607
751 <- 607 < T576_U21073_M0> -> 606
607 <- 606 < T575_U21072_M0> -> 750
606 <- 750 < T719_U21318_M0> -> 749
750 <- 749 < T718_U21317_M0> -> 555
749 <- 555 < T524_U21071_M0> -> 605
555 <- 605 < T574_U21067_M0> -> 604
605 <- 604 < T573_U21066_M0> -> 602
604 <- 602 < T571_U21063_M0> -> 601
602 <- 601 < T570_U21062_M0> -> 600
601 <- 600 < T569_U21061_M0> -> 671
600 <- 671 < T640_U21310_M0> -> 742
671 <- 742 < T711_U21301_M0> -> 585
742 <- 585 < T556_U21043_M0> -> 586
585 <- 586 < T555_U21042_M0> -> 584
586 <- 584 < T553_U21041_M0> -> 583
584 <- 583 < T552_U21048_M0> -> 582
583 <- 582 < T551_U21047_M0> -> 587
582 <- 587 < T554_U21040_M0> -> 741
587 <- 741 < T710_U21297_M0> -> 592
741 <- 592 < T561_U21037_M0> -> 591
592 <- 591 < T560_U21036_M0> -> 740
591 <- 740 < T709_U21296_M0> -> 739
740 <- 739 < T708_U21295_M0> -> 734
739 <- 734 < T703_U21289_M0> -> 735
734 <- 735 < T704_U21290_M0> -> 736
735 <- 736 < T705_U21292_M0> -> 589
736 <- 589 < T558_U21031_M0> -> 670
589 <- 670 < T639_U21178_M0> -> 733
670 <- 733 < T702_U21285_M0> -> 579
733 <- 579 < T548_U21020_M0> -> 732
579 <- 732 < T701_U21283_M0> -> 731
732 <- 731 < T700_U21282_M0> -> 730
731 <- 730 < T699_U21281_M0> -> 729
730 <- 729 < T698_U21280_M0> -> 568
729 <- 568 < T537_U21002_M0> -> 569
568 <- 569 < T538_U21004_M0> -> 722
569 <- 722 < T691_U21269_M0> -> 567
722 <- 567 < T536_U21001_M0> -> 718
567 <- 718 < T687_U21262_M0> -> 720
718 <- 720 < T689_U21264_M0> -> 719
720 <- 719 < T688_U21263_M0> -> 721
719 <- 721 < T690_U21266_M0> -> 566
721 <- 566 < T535_U21000_M0> -> 717
566 <- 717 < T686_U21261_M0> -> 553
717 <- 553 < T523_U20998_M0> -> 712
553 <- 712 < T681_U21255_M0> -> 711
712 <- 711 < T680_U21251_M0> -> 564
711 <- 564 < T533_U20994_M0> -> 563
564 <- 563 < T532_U20992_M0> -> 239
563 <- 239 < T208_U20991_M0> -> 377
239 <- 377 < T346_U20990_M0> -> 710
377 <- 710 < T679_U21249_M0> -> 697
710 <- 697 < T667_U21239_M0> -> 544
697 <- 544 < T513_U20966_M0> -> 548
544 <- 548 < T517_U20971_M0> -> 546
548 <- 546 < T515_U20968_M0> -> 547
546 <- 547 < T516_U20969_M0> -> 549
547 <- 549 < T518_U20974_M0> -> 700
549 <- 700 < T668_U21233_M0> -> 702
700 <- 702 < T670_U21234_M0> -> 543
702 <- 543 < T512_U20965_M0> -> 698
543 <- 698 < T671_U21232_M0> -> 706
698 <- 706 < T675_U21230_M0> -> 542
706 <- 542 < T511_U20963_M0> -> 541
542 <- 541 < T510_U20962_M0> -> 540
541 <- 540 < T509_U20958_M0> -> 674
540 <- 674 < T643_U21185_M0> -> 528
674 <- 528 < T497_U20937_M0> -> 492
528 <- 492 < T461_U20878_M0> -> 489
492 <- 489 < T458_U20872_M0> -> 457
489 <- 457 < T426_U20813_M0> -> 434
457 <- 434 < T404_U20779_M0> -> 395
434 <- 395 < T364_U20713_M0> -> 371
395 <- 371 < T342_U20674_M0> -> 354
371 <- 354 < T323_U20621_M0> -> 346
354 <- 346 < T315_U20602_M0> -> 344
346 <- 344 < T313_U20596_M0> -> 319
344 <- 319 < T288_U20578_M0> -> 291
319 <- 291 < T260_U20481_M0> -> 286
291 <- 286 < T255_U20475_M0> -> 284
286 <- 284 < T253_U20470_M0> -> 282
284 <- 282 < T251_U20559_M0> -> 231
282 <- 231 < T200_U20354_M0> -> 228
231 <- 228 < T197_U20347_M0> -> 221
228 <- 221 < T190_U20350_M0> -> 62
221 <- 62 < T121_U20226_M0> -> 78
62 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 100
69 <- 100 < T38_U20216_M0> -> 54
100 <- 54 < T23_U20175_M0> -> 89
54 <- 89 < T12_U20220_M0> -> 703
89 <- 703 < T672_U21223_M0> -> 713
703 <- 713 < T682_U21256_M0> -> 692
713 <- 692 < T661_U21211_M0> -> 694
692 <- 694 < T663_U21213_M0> -> 693
694 <- 693 < T662_U21212_M0> -> 691
693 <- 691 < T660_U21210_M0> -> 690
691 <- 690 < T659_U21209_M0> -> 532
690 <- 532 < T501_U20942_M0> -> 682
532 <- 682 < T651_U21197_M0> -> 714
682 <- 714 < T683_U21257_M0> -> 681
714 <- 681 < T650_U21193_M0> -> 680
681 <- 680 < T649_U21192_M0> -> 679
680 <- 679 < T648_U21191_M0> -> 678
679 <- 678 < T647_U21190_M0> -> 677
678 <- 677 < T646_U21189_M0> -> 676
677 <- 676 < T645_U21188_M0> -> 530
676 <- 530 < T499_U20939_M0> -> 529
530 <- 529 < T498_U20938_M0> -> 673
529 <- 673 < T642_U21184_M0> -> 527
673 <- 527 < T496_U20933_M0> -> 526
527 <- 526 < T495_U20931_M0> -> 514
526 <- 514 < T483_U20929_M0> -> 525
514 <- 525 < T494_U20930_M0> -> 453
525 <- 453 < T422_U21176_M0> -> 518
453 <- 518 < T487_U20919_M0> -> 664
518 <- 664 < T633_U21167_M0> -> 478
664 <- 478 < T447_U20910_M0> -> 513
478 <- 513 < T482_U20914_M0> -> 512
513 <- 512 < T481_U20912_M0> -> 495
512 <- 495 < T464_U20911_M0> -> 331
495 <- 331 < T300_U20907_M0> -> 511
331 <- 511 < T480_U20909_M0> -> 662
511 <- 662 < T631_U21162_M0> -> 661
662 <- 661 < T630_U21161_M0> -> 659
661 <- 659 < T628_U21159_M0> -> 658
659 <- 658 < T627_U21157_M0> -> 508
658 <- 508 < T477_U20902_M0> -> 507
508 <- 507 < T476_U20901_M0> -> 506
507 <- 506 < T475_U20899_M0> -> 497
506 <- 497 < T466_U20886_M0> -> 496
497 <- 496 < T465_U20884_M0> -> 494
496 <- 494 < T463_U20881_M0> -> 493
494 <- 493 < T462_U20880_M0> -> 429
493 <- 429 < T398_U20877_M0> -> 490
429 <- 490 < T459_U20873_M0> -> 488
490 <- 488 < T457_U20870_M0> -> 487
488 <- 487 < T456_U20869_M0> -> 486
487 <- 486 < T455_U20868_M0> -> 485
486 <- 485 < T454_U20867_M0> -> 471
485 <- 471 < T444_U20853_M0> -> 474
471 <- 474 < T443_U20850_M0> -> 473
474 <- 473 < T442_U20851_M0> -> 476
473 <- 476 < T441_U20847_M0> -> 479
476 <- 479 < T448_U20845_M0> -> 477
479 <- 477 < T446_U20840_M0> -> 462
477 <- 462 < T431_U20819_M0> -> 460
462 <- 460 < T429_U20817_M0> -> 458
460 <- 458 < T427_U20814_M0> -> 454
458 <- 454 < T423_U20807_M0> -> 452
454 <- 452 < T421_U20804_M0> -> 451
452 <- 451 < T420_U20803_M0> -> 301
451 <- 301 < T270_U20802_M0> -> 298
301 <- 298 < T267_U20801_M0> -> 445
298 <- 445 < T414_U20793_M0> -> 444
445 <- 444 < T413_U20790_M0> -> 443
444 <- 443 < T412_U20789_M0> -> 441
443 <- 441 < T410_U20786_M0> -> 440
441 <- 440 < T409_U20785_M0> -> 438
440 <- 438 < T407_U20783_M0> -> 436
438 <- 436 < T405_U20781_M0> -> 431
436 <- 431 < T400_U20775_M0> -> 435
431 <- 435 < T403_U20778_M0> -> 432
435 <- 432 < T401_U20776_M0> -> 430
432 <- 430 < T399_U20774_M0> -> 428
430 <- 428 < T397_U20769_M0> -> 421
428 <- 421 < T390_U20755_M0> -> 418
421 <- 418 < T387_U20751_M0> -> 420
418 <- 420 < T389_U20754_M0> -> 417
420 <- 417 < T386_U20750_M0> -> 414
417 <- 414 < T383_U20742_M0> -> 413
414 <- 413 < T382_U20741_M0> -> 412
413 <- 412 < T381_U20740_M0> -> 411
412 <- 411 < T380_U20739_M0> -> 404
411 <- 404 < T373_U20738_M0> -> 405
404 <- 405 < T374_U20729_M0> -> 402
405 <- 402 < T371_U20725_M0> -> 401
402 <- 401 < T370_U20724_M0> -> 403
401 <- 403 < T372_U20727_M0> -> 379
403 <- 379 < T348_U20723_M0> -> 398
379 <- 398 < T367_U20720_M0> -> 400
398 <- 400 < T369_U20722_M0> -> 399
400 <- 399 < T368_U20721_M0> -> 369
399 <- 369 < T338_U20717_M0> -> 397
369 <- 397 < T366_U20718_M0> -> 393
397 <- 393 < T362_U20711_M0> -> 394
393 <- 394 < T363_U20712_M0> -> 387
394 <- 387 < T356_U20700_M0> -> 386
387 <- 386 < T355_U20698_M0> -> 385
386 <- 385 < T354_U20697_M0> -> 384
385 <- 384 < T353_U20694_M0> -> 372
384 <- 372 < T343_U20691_M0> -> 245
372 <- 245 < T213_U20687_M0> -> 373
245 <- 373 < T345_U20679_M0> -> 374
373 <- 374 < T340_U20677_M0> -> 375
374 <- 375 < T344_U20676_M0> -> 382
375 <- 382 < T351_U20671_M0> -> 381
382 <- 381 < T350_U20670_M0> -> 378
381 <- 378 < T347_U20664_M0> -> 368
378 <- 368 < T337_U20653_M0> -> 367
368 <- 367 < T336_U20648_M0> -> 366
367 <- 366 < T335_U20645_M0> -> 364
366 <- 364 < T333_U20643_M0> -> 363
364 <- 363 < T332_U20642_M0> -> 361
363 <- 361 < T330_U20639_M0> -> 248
361 <- 248 < T214_U20638_M0> -> 327
248 <- 327 < T296_U20636_M0> -> 360
327 <- 360 < T329_U20632_M0> -> 357
360 <- 357 < T326_U20626_M0> -> 358
357 <- 358 < T327_U20627_M0> -> 356
358 <- 356 < T325_U20624_M0> -> 355
356 <- 355 < T324_U20622_M0> -> 222
355 <- 222 < T191_U20619_M0> -> 352
222 <- 352 < T321_U20616_M0> -> 350
352 <- 350 < T319_U20613_M0> -> 348
350 <- 348 < T317_U20605_M0> -> 347
348 <- 347 < T316_U20603_M0> -> 349
347 <- 349 < T318_U20606_M0> -> 343
349 <- 343 < T312_U20595_M0> -> 342
343 <- 342 < T311_U20594_M0> -> 341
342 <- 341 < T310_U20592_M0> -> 340
341 <- 340 < T309_U20587_M0> -> 338
340 <- 338 < T307_U20585_M0> -> 318
338 <- 318 < T287_U20582_M0> -> 337
318 <- 337 < T306_U20581_M0> -> 336
337 <- 336 < T305_U20580_M0> -> 334
336 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> 333
335 <- 333 < T302_U20568_M0> -> 332
333 <- 332 < T301_U20567_M0> -> 330
332 <- 330 < T299_U20563_M0> -> 328
330 <- 328 < T297_U20560_M0> -> 325
328 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 309
326 <- 309 < T280_U20543_M0> -> 322
309 <- 322 < T291_U20546_M0> -> 312
322 <- 312 < T283_U20541_M0> -> 310
312 <- 310 < T282_U20544_M0> -> 323
310 <- 323 < T292_U20547_M0> -> 308
323 <- 308 < T276_U20537_M0> -> 314
308 <- 314 < T278_U20536_M0> -> 277
314 <- 277 < T246_U20530_M0> -> 320
277 <- 320 < T289_U20532_M0> -> 317
320 <- 317 < T286_U20526_M0> -> 316
317 <- 316 < T285_U20525_M0> -> 315
316 <- 315 < T284_U20523_M0> -> 306
315 <- 306 < T275_U20514_M0> -> 276
306 <- 276 < T245_U20504_M0> -> 303
276 <- 303 < T272_U20501_M0> -> 300
303 <- 300 < T269_U20497_M0> -> 299
300 <- 299 < T268_U20492_M0> -> 297
299 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 292
296 <- 292 < T261_U20483_M0> -> 294
292 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 290
293 <- 290 < T259_U20480_M0> -> 289
290 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 281
288 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 275
274 <- 275 < T244_U20464_M0> -> 280
275 <- 280 < T249_U20458_M0> -> 271
280 <- 271 < T240_U20455_M0> -> 135
271 <- 135 < T71_U20451_M0> -> 278
135 <- 278 < T247_U20447_M0> -> 273
278 <- 273 < T242_U20441_M0> -> 272
273 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 270
241 <- 270 < T239_U20436_M0> -> 268
270 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 267
258 <- 267 < T236_U20429_M0> -> 265
267 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 264
266 <- 264 < T233_U20426_M0> -> 263
264 <- 263 < T232_U20424_M0> -> 262
263 <- 262 < T231_U20419_M0> -> 261
262 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 229
260 <- 229 < T198_U20411_M0> -> 259
229 <- 259 < T228_U20415_M0> -> 257
259 <- 257 < T226_U20412_M0> -> 119
257 <- 119 < T11_U20406_M0> -> 242
119 <- 242 < T212_U20408_M0> -> 247
242 <- 247 < T211_U20403_M0> -> 246
247 <- 246 < T215_U20400_M0> -> 255
246 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 253
249 <- 253 < T222_U20386_M0> -> 252
253 <- 252 < T221_U20383_M0> -> 240
252 <- 240 < T209_U20370_M0> -> 250
240 <- 250 < T219_U20381_M0> -> 237
250 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 236
238 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 230
233 <- 230 < T199_U20353_M0> -> 192
230 <- 192 < T161_U20349_M0> -> 226
192 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 220
224 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 213
218 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 206
210 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 202
204 <- 202 < T171_U20295_M0> -> 201
202 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 196
199 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 193
103 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 189
91 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 182
191 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 190
176 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 184
158 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 188
154 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 175
160 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 146
152 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 138
76 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 45
59 <- 45 < T76_U20199_M0> -> 39
45 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 148
171 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 129
170 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 58
144 <- 58 < T77_U20172_M0> -> 117
58 <- 117 < T31_U20168_M0> -> 60
117 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 70
163 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 118
114 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 93
167 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 285
283 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 321
305 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 311
313 <- 311 < T279_U20542_M0> -> 165
311 <- 165 < T75_U20558_M0> -> 329
165 <- 329 < T298_U20562_M0> -> 339
329 <- 339 < T308_U20586_M0> -> 345
339 <- 345 < T314_U20597_M0> -> 307
345 <- 307 < T277_U20600_M0> -> 351
307 <- 351 < T320_U20615_M0> -> 353
351 <- 353 < T322_U20620_M0> -> 359
353 <- 359 < T328_U20630_M0> -> 362
359 <- 362 < T331_U20641_M0> -> 380
362 <- 380 < T349_U20669_M0> -> 376
380 <- 376 < T339_U20673_M0> -> 370
376 <- 370 < T341_U20690_M0> -> 365
370 <- 365 < T334_U20692_M0> -> 388
365 <- 388 < T357_U20702_M0> -> 389
388 <- 389 < T358_U20703_M0> -> 390
389 <- 390 < T359_U20704_M0> -> 391
390 <- 391 < T360_U20705_M0> -> 392
391 <- 392 < T361_U20707_M0> -> 396
392 <- 396 < T365_U20714_M0> -> 383
396 <- 383 < T352_U20716_M0> -> 406
383 <- 406 < T375_U20731_M0> -> 407
406 <- 407 < T376_U20732_M0> -> 408
407 <- 408 < T377_U20733_M0> -> 409
408 <- 409 < T378_U20734_M0> -> 410
409 <- 410 < T379_U20736_M0> -> 416
410 <- 416 < T385_U20745_M0> -> 287
416 <- 287 < T256_U20748_M0> -> 419
287 <- 419 < T388_U20752_M0> -> 422
419 <- 422 < T391_U20758_M0> -> 423
422 <- 423 < T392_U20760_M0> -> 424
423 <- 424 < T393_U20761_M0> -> 425
424 <- 425 < T394_U20762_M0> -> 426
425 <- 426 < T395_U20763_M0> -> 427
426 <- 427 < T396_U20764_M0> -> 433
427 <- 433 < T402_U20777_M0> -> 437
433 <- 437 < T406_U20782_M0> -> 439
437 <- 439 < T408_U20784_M0> -> 446
439 <- 446 < T415_U20795_M0> -> 447
446 <- 447 < T416_U20796_M0> -> 448
447 <- 448 < T417_U20797_M0> -> 449
448 <- 449 < T418_U20798_M0> -> 450
449 <- 450 < T419_U20799_M0> -> 324
450 <- 324 < T293_U20809_M0> -> 455
324 <- 455 < T424_U20810_M0> -> 456
455 <- 456 < T425_U20812_M0> -> 461
456 <- 461 < T430_U20818_M0> -> 463
461 <- 463 < T432_U20821_M0> -> 464
463 <- 464 < T433_U20823_M0> -> 465
464 <- 465 < T434_U20824_M0> -> 466
465 <- 466 < T435_U20825_M0> -> 467
466 <- 467 < T436_U20827_M0> -> 468
467 <- 468 < T437_U20828_M0> -> 459
468 <- 459 < T428_U20844_M0> -> 472
459 <- 472 < T440_U20854_M0> -> 470
472 <- 470 < T439_U20856_M0> -> 480
470 <- 480 < T449_U20859_M0> -> 481
480 <- 481 < T450_U20860_M0> -> 482
481 <- 482 < T451_U20861_M0> -> 483
482 <- 483 < T452_U20863_M0> -> 484
483 <- 484 < T453_U20864_M0> -> 491
484 <- 491 < T460_U20874_M0> -> 415
491 <- 415 < T384_U20876_M0> -> 498
415 <- 498 < T467_U20887_M0> -> 499
498 <- 499 < T468_U20888_M0> -> 500
499 <- 500 < T469_U20890_M0> -> 501
500 <- 501 < T470_U20891_M0> -> 502
501 <- 502 < T471_U20892_M0> -> 503
502 <- 503 < T472_U20893_M0> -> 504
503 <- 504 < T473_U20896_M0> -> 505
504 <- 505 < T474_U20897_M0> -> 509
505 <- 509 < T478_U20903_M0> -> 510
509 <- 510 < T479_U20908_M0> -> 515
510 <- 515 < T484_U20916_M0> -> 516
515 <- 516 < T485_U20917_M0> -> 517
516 <- 517 < T486_U20918_M0> -> 519
517 <- 519 < T488_U20921_M0> -> 520
519 <- 520 < T489_U20922_M0> -> 521
520 <- 521 < T490_U20923_M0> -> 522
521 <- 522 < T491_U20924_M0> -> 523
522 <- 523 < T492_U20926_M0> -> 524
523 <- 524 < T493_U20927_M0> -> 475
524 <- 475 < T445_U20936_M0> -> 531
475 <- 531 < T500_U20941_M0> -> 533
531 <- 533 < T502_U20946_M0> -> 534
533 <- 534 < T503_U20947_M0> -> 535
534 <- 535 < T504_U20949_M0> -> 536
535 <- 536 < T505_U20950_M0> -> 537
536 <- 537 < T506_U20951_M0> -> 538
537 <- 538 < T507_U20952_M0> -> 539
538 <- 539 < T508_U20953_M0> -> 469
539 <- 469 < T438_U20961_M0> -> 545
469 <- 545 < T514_U20967_M0> -> 550
545 <- 550 < T519_U20975_M0> -> 551
550 <- 551 < T520_U20976_M0> -> 552
551 <- 552 < T521_U20977_M0> -> 558
552 <- 558 < T527_U20984_M0> -> 559
558 <- 559 < T528_U20985_M0> -> 560
559 <- 560 < T529_U20986_M0> -> 561
560 <- 561 < T530_U20987_M0> -> 562
561 <- 562 < T531_U20988_M0> -> 554
562 <- 554 < T522_U20997_M0> -> 570
554 <- 570 < T539_U21007_M0> -> 571
570 <- 571 < T540_U21008_M0> -> 572
571 <- 572 < T541_U21009_M0> -> 573
572 <- 573 < T542_U21010_M0> -> 574
573 <- 574 < T543_U21012_M0> -> 575
574 <- 575 < T544_U21013_M0> -> 576
575 <- 576 < T545_U21014_M0> -> 577
576 <- 577 < T546_U21015_M0> -> 578
577 <- 578 < T547_U21016_M0> -> 590
578 <- 590 < T559_U21035_M0> -> 581
590 <- 581 < T550_U21049_M0> -> 593
581 <- 593 < T562_U21050_M0> -> 595
593 <- 595 < T564_U21053_M0> -> 596
595 <- 596 < T565_U21054_M0> -> 597
596 <- 597 < T566_U21055_M0> -> 598
597 <- 598 < T567_U21056_M0> -> 599
598 <- 599 < T568_U21057_M0> -> 556
599 <- 556 < T525_U21069_M0> -> 609
556 <- 609 < T578_U21075_M0> -> 612
609 <- 612 < T581_U21080_M0> -> 613
612 <- 613 < T582_U21081_M0> -> 614
613 <- 614 < T583_U21082_M0> -> 615
614 <- 615 < T584_U21084_M0> -> 616
615 <- 616 < T585_U21085_M0> -> 617
616 <- 617 < T586_U21086_M0> -> 618
617 <- 618 < T587_U21088_M0> -> 619
618 <- 619 < T588_U21089_M0> -> 565
619 <- 565 < T534_U21096_M0> -> 624
565 <- 624 < T593_U21101_M0> -> 580
624 <- 580 < T549_U21103_M0> -> 631
580 <- 631 < T600_U21110_M0> -> 633
631 <- 633 < T602_U21113_M0> -> 634
633 <- 634 < T603_U21114_M0> -> 635
634 <- 635 < T604_U21115_M0> -> 636
635 <- 636 < T605_U21117_M0> -> 637
636 <- 637 < T606_U21118_M0> -> 588
637 <- 588 < T557_U21129_M0> -> 645
588 <- 645 < T614_U21133_M0> -> 646
645 <- 646 < T615_U21134_M0> -> 650
646 <- 650 < T619_U21141_M0> -> 651
650 <- 651 < T620_U21142_M0> -> 652
651 <- 652 < T621_U21144_M0> -> 653
652 <- 653 < T622_U21145_M0> -> 654
653 <- 654 < T623_U21146_M0> -> 655
654 <- 655 < T624_U21147_M0> -> 656
655 <- 656 < T625_U21148_M0> -> 660
656 <- 660 < T629_U21160_M0> -> 663
660 <- 663 < T632_U21166_M0> -> 665
663 <- 665 < T634_U21169_M0> -> 666
665 <- 666 < T635_U21170_M0> -> 667
666 <- 667 < T636_U21171_M0> -> 668
667 <- 668 < T637_U21172_M0> -> 669
668 <- 669 < T638_U21173_M0> -> 672
669 <- 672 < T641_U21180_M0> -> 632
672 <- 632 < T601_U21183_M0> -> 675
632 <- 675 < T644_U21187_M0> -> 683
675 <- 683 < T652_U21198_M0> -> 684
683 <- 684 < T653_U21199_M0> -> 685
684 <- 685 < T654_U21201_M0> -> 686
685 <- 686 < T655_U21202_M0> -> 687
686 <- 687 < T656_U21203_M0> -> 688
687 <- 688 < T657_U21204_M0> -> 689
688 <- 689 < T658_U21205_M0> -> 704
689 <- 704 < T673_U21224_M0> -> 705
704 <- 705 < T674_U21228_M0> -> 701
705 <- 701 < T669_U21231_M0> -> 699
701 <- 699 < T666_U21238_M0> -> 695
699 <- 695 < T664_U21241_M0> -> 696
695 <- 696 < T665_U21242_M0> -> 707
696 <- 707 < T676_U21243_M0> -> 708
707 <- 708 < T677_U21244_M0> -> 709
708 <- 709 < T678_U21245_M0> -> 442
709 <- 442 < T411_U21254_M0> -> 715
442 <- 715 < T684_U21259_M0> -> 716
715 <- 716 < T685_U21260_M0> -> 723
716 <- 723 < T692_U21270_M0> -> 724
723 <- 724 < T693_U21272_M0> -> 725
724 <- 725 < T694_U21273_M0> -> 726
725 <- 726 < T695_U21274_M0> -> 727
726 <- 727 < T696_U21275_M0> -> 728
727 <- 728 < T697_U21276_M0> -> 737
728 <- 737 < T706_U21293_M0> -> 738
737 <- 738 < T707_U21294_M0> -> 743
738 <- 743 < T712_U21303_M0> -> 744
743 <- 744 < T713_U21304_M0> -> 745
744 <- 745 < T714_U21305_M0> -> 746
745 <- 746 < T715_U21306_M0> -> 747
746 <- 747 < T716_U21308_M0> -> 748
747 <- 748 < T717_U21315_M0> -> 752
748 <- 752 < T721_U21323_M0> -> 753
752 <- 753 < T722_U21325_M0> -> 754
753 <- 754 < T723_U21326_M0> -> 755
754 <- 755 < T724_U21327_M0> -> 756
755 <- 756 < T725_U21328_M0> -> 757
756 <- 757 < T726_U21330_M0> -> 764
757 <- 764 < T733_U21340_M0> -> 766
764 <- 766 < T735_U21342_M0> -> 768
766 <- 768 < T737_U21347_M0> -> 779
768 <- 779 < T748_U21363_M0> -> 780
779 <- 780 < T749_U21364_M0> -> 781
780 <- 781 < T750_U21365_M0> -> 782
781 <- 782 < T751_U21366_M0> -> 783
782 <- 783 < T752_U21368_M0> -> 784
783 <- 784 < T753_U21372_M0> -> 594
784 <- 594 < T563_U21374_M0> -> 787
594 <- 787 < T756_U21379_M0> -> 791
787 <- 791 < T760_U21385_M0> -> 792
791 <- 792 < T761_U21386_M0> -> 793
792 <- 793 < T762_U21388_M0> -> 794
793 <- 794 < T763_U21389_M0> -> 795
794 <- 795 < T764_U21390_M0> -> 796
795 <- 796 < T765_U21392_M0> -> 797
796 <- 797 < T766_U21394_M0> -> 812
797 <- 812 < T781_U21411_M0> -> 814
812 <- 814 < T783_U21416_M0> -> 802
814 <- 802 < T772_U21419_M0> -> 799
802 <- 799 < T770_U21420_M0> -> 804
799 <- 804 < T771_U21421_M0> -> 805
804 <- 805 < T773_U21423_M0> -> 803
805 <- 803 < T768_U21424_M0> -> 800
803 <- 800 < T769_U21426_M0> -> 801
800 <- 801 < T774_U21428_M0> -> 798
801 <- 798 < T767_U21429_M0> -> 816
798 <- 816 < T785_U21431_M0> -> 817
816 <- 817 < T786_U21432_M0> -> 818
817 <- 818 < T787_U21433_M0> -> 819
818 <- 819 < T788_U21435_M0> -> 820
819 <- 820 < T789_U21437_M0> -> 821
820 <- 821 < T790_U21439_M0> -> 603
821 <- 603 < T572_U21442_M0> -> 786
603 <- 786 < T755_U21443_M0> -> 822
786 <- 822 < T791_U21444_M0> -> 823
822 <- 823 < T792_U21445_M0> -> 824
823 <- 824 < T793_U21447_M0> -> 825
824 <- 825 < T794_U21448_M0> -> 827
825 <- 827 < T796_U21451_M0> -> 828
827 <- 828 < T797_U21452_M0> -> 829
828 <- 829 < T798_U21453_M0> -> 830
829 <- 830 < T799_U21455_M0> -> 831
830 <- 831 < T800_U21456_M0> -> 832
831 <- 832 < T801_U21458_M0> -> 833
832 <- 833 < T802_U21459_M0> -> 834
833 <- 834 < T803_U21460_M0> -> 835
834 <- 835 < T804_U21461_M0> -> 836
835 <- 836 < T805_U21464_M0> -> 837
836 <- 837 < T806_U21466_M0> -> 838
837 <- 838 < T807_U21467_M0> -> 839
838 <- 839 < T808_U21468_M0> -> 840
839 <- 840 < T809_U21469_M0> -> 841
840 <- 841 < T810_U21471_M0> -> 640
841 <- 640 < T609_U21473_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1
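Each dump line above has the shape `prev <- slot < Txxx_Uyyyyy_M0> -> next`: a doubly-linked chain where one line's `next` must be the following line's `slot`, whose `prev` points back, with `-1` terminating both ends. A consistency check for such a dump (the line format is inferred from the output above; this is an illustrative sketch, not SAP code):

```python
import re

# Matches lines like "-1 <- 140 < T30_U25456_M0> -> 164"
LINE = re.compile(r"^\s*(-?\d+)\s*<-\s*(-?\d+)\s*<\s*(\S+)>\s*->\s*(-?\d+)")

def check_chain(dump_lines):
    """Verify the doubly-linked session chain; return session IDs in queue order."""
    entries = []
    for line in dump_lines:
        m = LINE.match(line)
        if m:
            prev, slot, nxt = int(m.group(1)), int(m.group(2)), int(m.group(4))
            entries.append((prev, slot, m.group(3), nxt))
    assert entries[0][0] == -1 and entries[-1][3] == -1, "chain must start/end at -1"
    for (p1, s1, _, n1), (p2, s2, _, _) in zip(entries, entries[1:]):
        assert n1 == s2 and p2 == s1, f"broken link between slot {s1} and {s2}"
    return [sid for _, _, sid, _ in entries]

# The low-priority queue above, verbatim (2 elements):
dump = [
    "-1 <- 140 < T30_U25456_M0> -> 164",
    "140 <- 164 < T103_U20248_M0> -> -1",
]
print(check_chain(dump))
```

Run over the 798-element normal-priority dump, the same check confirms the chain is unbroken and yields the sessions in their queued order.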

Requests in queue <W0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W1> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W5> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W6> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <W7> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (2 requests, queue in use):
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (15 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (15 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (16 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (28 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 8 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (15 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (17 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (21 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20619_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (16 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (21 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20991_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20687_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T215_U20400_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20638_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T247_U20447_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (20 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (15 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (20 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20748_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T257_U20478_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (20 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20801_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20802_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20600_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T276_U20537_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T280_U20543_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T282_U20544_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T284_U20523_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20582_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T288_U20578_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T289_U20532_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T292_U20547_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T293_U20809_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T294_U20553_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T295_U20554_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T296_U20636_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T297_U20560_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T300_U20907_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T301_U20567_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T302_U20568_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T303_U20574_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T304_U20575_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T305_U20580_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T306_U20581_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T307_U20585_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T308_U20586_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T309_U20587_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T310_U20592_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T311_U20594_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T312_U20595_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T313_U20596_M0> (15 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T314_U20597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T315_U20602_M0> (19 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T316_U20603_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T317_U20605_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T318_U20606_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T319_U20613_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T320_U20615_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T321_U20616_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T322_U20620_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T323_U20621_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T324_U20622_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T325_U20624_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T326_U20626_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T327_U20627_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T328_U20630_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T329_U20632_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T330_U20639_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T331_U20641_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T332_U20642_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T333_U20643_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T334_U20692_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T335_U20645_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T336_U20648_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T337_U20653_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T338_U20717_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T341_U20690_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T342_U20674_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T343_U20691_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T345_U20679_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T340_U20677_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T344_U20676_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T339_U20673_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T346_U20990_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T347_U20664_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T348_U20723_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T349_U20669_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T350_U20670_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T351_U20671_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T352_U20716_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T353_U20694_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T354_U20697_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T355_U20698_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T356_U20700_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T357_U20702_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T358_U20703_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T359_U20704_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T360_U20705_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T361_U20707_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T362_U20711_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T363_U20712_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T364_U20713_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T365_U20714_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T366_U20718_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T367_U20720_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T368_U20721_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T369_U20722_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T370_U20724_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T371_U20725_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T372_U20727_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T373_U20738_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T374_U20729_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T375_U20731_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T376_U20732_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T377_U20733_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T378_U20734_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T379_U20736_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T380_U20739_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T381_U20740_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T382_U20741_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T383_U20742_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T384_U20876_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T385_U20745_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T386_U20750_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T387_U20751_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T388_U20752_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T389_U20754_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T390_U20755_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T391_U20758_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T392_U20760_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T393_U20761_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T394_U20762_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T395_U20763_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T396_U20764_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T397_U20769_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T398_U20877_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T399_U20774_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T400_U20775_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T401_U20776_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T402_U20777_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T404_U20779_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T403_U20778_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T405_U20781_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T406_U20782_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T407_U20783_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T408_U20784_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T409_U20785_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T410_U20786_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T411_U21254_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T412_U20789_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T413_U20790_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T414_U20793_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T415_U20795_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T416_U20796_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T417_U20797_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T418_U20798_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T419_U20799_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T420_U20803_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T421_U20804_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T422_U21176_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T423_U20807_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T424_U20810_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T425_U20812_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T426_U20813_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T427_U20814_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T428_U20844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T429_U20817_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T430_U20818_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T431_U20819_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T432_U20821_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T433_U20823_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T434_U20824_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T435_U20825_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T436_U20827_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T437_U20828_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T438_U20961_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T439_U20856_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T444_U20853_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T440_U20854_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T442_U20851_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T443_U20850_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T445_U20936_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T441_U20847_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T446_U20840_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T447_U20910_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T448_U20845_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T449_U20859_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T450_U20860_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T451_U20861_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T452_U20863_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T453_U20864_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T454_U20867_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T455_U20868_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T456_U20869_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T457_U20870_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T458_U20872_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T459_U20873_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T460_U20874_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T461_U20878_M0> (20 requests):
- 16 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T462_U20880_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T463_U20881_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T464_U20911_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T465_U20884_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T466_U20886_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T467_U20887_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T468_U20888_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T469_U20890_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T470_U20891_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T471_U20892_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T472_U20893_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T473_U20896_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T474_U20897_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T475_U20899_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T476_U20901_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T477_U20902_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T478_U20903_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T479_U20908_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T480_U20909_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T481_U20912_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T482_U20914_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T483_U20929_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T484_U20916_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T485_U20917_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T486_U20918_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T487_U20919_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T488_U20921_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T489_U20922_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T490_U20923_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T491_U20924_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T492_U20926_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T493_U20927_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T494_U20930_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T495_U20931_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T496_U20933_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T497_U20937_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T498_U20938_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T499_U20939_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T500_U20941_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T501_U20942_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T502_U20946_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T503_U20947_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T504_U20949_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T505_U20950_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T506_U20951_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T507_U20952_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T508_U20953_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T509_U20958_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T510_U20962_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T511_U20963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T512_U20965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T513_U20966_M0> (22 requests):
- 19 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T514_U20967_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T515_U20968_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T516_U20969_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T517_U20971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T518_U20974_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T519_U20975_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T520_U20976_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T521_U20977_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T523_U20998_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T522_U20997_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T524_U21071_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T525_U21069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T526_U21370_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T527_U20984_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T528_U20985_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T529_U20986_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T530_U20987_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T531_U20988_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T532_U20992_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T533_U20994_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T534_U21096_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T535_U21000_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T536_U21001_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T537_U21002_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T538_U21004_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T539_U21007_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T540_U21008_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T541_U21009_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T542_U21010_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T543_U21012_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T544_U21013_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T545_U21014_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T546_U21015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T547_U21016_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T548_U21020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T549_U21103_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T550_U21049_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T551_U21047_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T552_U21048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T553_U21041_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T556_U21043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T555_U21042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T554_U21040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T557_U21129_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T558_U21031_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T559_U21035_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T560_U21036_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T561_U21037_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T562_U21050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T563_U21374_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T564_U21053_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T565_U21054_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T566_U21055_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T567_U21056_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T568_U21057_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T569_U21061_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T570_U21062_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T571_U21063_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T572_U21442_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T573_U21066_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T574_U21067_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T575_U21072_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T576_U21073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T577_U21097_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T578_U21075_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T579_U21076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T580_U21079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T581_U21080_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T582_U21081_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T583_U21082_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T584_U21084_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T585_U21085_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T586_U21086_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T587_U21088_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T588_U21089_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T589_U21092_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T590_U21094_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T591_U21098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T592_U21099_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T593_U21101_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T594_U21102_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T595_U21104_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T596_U21105_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T597_U21120_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T598_U21108_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T599_U21109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T600_U21110_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T601_U21183_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T602_U21113_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T603_U21114_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T604_U21115_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T605_U21117_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T606_U21118_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T607_U21121_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T608_U21122_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T609_U21473_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T610_U21125_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T611_U21375_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T612_U21130_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T613_U21131_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T614_U21133_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T615_U21134_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T616_U21135_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T617_U21136_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T618_U21138_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T619_U21141_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T620_U21142_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T621_U21144_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T622_U21145_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T623_U21146_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T624_U21147_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T625_U21148_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T626_U21152_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T627_U21157_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T628_U21159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T629_U21160_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T630_U21161_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T631_U21162_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T632_U21166_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T633_U21167_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T634_U21169_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T635_U21170_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T636_U21171_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T637_U21172_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T638_U21173_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T639_U21178_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T640_U21310_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T641_U21180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T642_U21184_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T643_U21185_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T644_U21187_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T645_U21188_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T646_U21189_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T647_U21190_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T648_U21191_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T649_U21192_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T650_U21193_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T651_U21197_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T652_U21198_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T653_U21199_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T654_U21201_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T655_U21202_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T656_U21203_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T657_U21204_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T658_U21205_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T659_U21209_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T660_U21210_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T661_U21211_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T662_U21212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T663_U21213_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T664_U21241_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T665_U21242_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T667_U21239_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T671_U21232_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T666_U21238_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T668_U21233_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T669_U21231_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T670_U21234_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T672_U21223_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T673_U21224_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T674_U21228_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T675_U21230_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T676_U21243_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T677_U21244_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T678_U21245_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T679_U21249_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T680_U21251_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T681_U21255_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T682_U21256_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T683_U21257_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T684_U21259_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T685_U21260_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T686_U21261_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T687_U21262_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T688_U21263_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T689_U21264_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T690_U21266_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T691_U21269_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T692_U21270_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T693_U21272_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T694_U21273_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T695_U21274_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T696_U21275_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T697_U21276_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T698_U21280_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T699_U21281_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T700_U21282_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T701_U21283_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T702_U21285_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T703_U21289_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T704_U21290_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T705_U21292_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T706_U21293_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T707_U21294_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T708_U21295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T709_U21296_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T710_U21297_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T711_U21301_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T712_U21303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T713_U21304_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T714_U21305_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T715_U21306_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T716_U21308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T717_U21315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T718_U21317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T719_U21318_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T720_U21322_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T721_U21323_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T722_U21325_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T723_U21326_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T724_U21327_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T725_U21328_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T726_U21330_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T727_U21333_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T728_U21334_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T729_U21335_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T730_U21336_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T731_U21337_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T732_U21339_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T733_U21340_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T734_U21341_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T735_U21342_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T736_U21346_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T737_U21347_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T738_U21348_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T739_U21349_M0> (14 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T740_U21351_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T741_U21352_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T742_U21353_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T743_U21354_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T744_U21355_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T745_U21356_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T746_U21358_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T747_U21361_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T748_U21363_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T749_U21364_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T750_U21365_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T751_U21366_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T752_U21368_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T753_U21372_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T754_U21377_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T755_U21443_M0> (13 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T756_U21379_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T757_U21380_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T758_U21382_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T759_U21384_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T760_U21385_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T761_U21386_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T762_U21388_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T763_U21389_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T764_U21390_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T765_U21392_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T766_U21394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T767_U21429_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T770_U21420_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T769_U21426_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T774_U21428_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T772_U21419_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T768_U21424_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T771_U21421_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T773_U21423_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T775_U21404_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T776_U21405_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T777_U21406_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T778_U21407_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T779_U21409_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T780_U21410_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T781_U21411_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T782_U21415_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T783_U21416_M0> (8 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T784_U21417_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T785_U21431_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T786_U21432_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T787_U21433_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T788_U21435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T789_U21437_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T790_U21439_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T791_U21444_M0> (3 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T792_U21445_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T793_U21447_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T794_U21448_M0> (13 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T796_U21451_M0> (4 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T797_U21452_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T798_U21453_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T799_U21455_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T800_U21456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T801_U21458_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T802_U21459_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T803_U21460_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T804_U21461_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T805_U21464_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T806_U21466_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T807_U21467_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T808_U21468_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T809_U21469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T810_U21471_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 05:24:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 188
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid |Type|State |Cause|Err|Prio|Sess-Key |Sess-Type|Locked|Sem|Time |Program |Cli|User |Action |Action-Info |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
| 0| |DIA |WP_KILL| |11 |norm|T22_U19960_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |
| 1| |DIA |WP_KILL| |217|norm|T59_U19947_M0 |HTTP_NORM| | | |SAPMHTTP |001|SM_EXTERN_WS| | |
| 2|32121 |DIA |WP_HOLD|RFC | |low |T10_U9773_M0 |ASYNC_RFC| | |10550|SAPMSSY1 |001|SM_EFWK | | |
| 3|19909 |DIA |WP_RUN | | |high|T795_U21472_M0 |INTERNAL | | | 3| | | | | |
| 4|19804 |DIA |WP_RUN | | |norm|T780_U21410_M0 |HTTP_NORM| | | 3| | | | | |
| 5| |DIA |WP_KILL| |11 |high|T109_U5012_M1 |GUI | | | |SBAL_DELETE |001|EXT_SCHAITAN| | |
| 6| |DIA |WP_KILL| |12 | | | | | | | | | | | |
| 7| |DIA |WP_KILL| |11 |norm|T55_U19953_M0 |HTTP_NORM| | | | |001|SM_EXTERN_WS| | |

Found 8 active workprocesses


Total number of workprocesses is 16

Session Table Sun Sep 22 05:24:43 2019


------------------------------------------------------------

|Logon-Type |Sess-Key |Cli|User |Terminal |Time |WP |Program |Prio|Tasks|Application-Info |Tcode |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| |
|norm|3 | | |
0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |
SAPMSSY1 |low | |
| | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| |
|norm|15 | | |
0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| |
|norm|7 | | |
0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| |
|norm|15 | | |
0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| |
|norm|6 | | |
0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| |
|norm|3 | | |
0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |
SAPMSSY1 |low |2 |
| | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| |
|norm|3 | | |
0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| |
|norm|3 | | |
0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| |
|norm|17 | | |
0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| |
|norm|7 | | |
0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| |
|norm|3 | | |
0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| |
|norm|3 | | |
0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| |
|norm|3 | | |
0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |
SAPMSSY1 |low | |
| | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| |
|norm|4 | | |
0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| |
|norm|3 | | |
0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |
SAPMSSY1 |norm|2 |
| | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| |
|norm|16 | | |
0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |
SAPMSSY1 |norm| |
| | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T71_U20451_M0 | | |10.54.36.35 |04:49:29| |
|norm|3 | | |
0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| |
|norm|3 | | |
0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| |
|norm|1 | | |
0|
|SYNC_RFC |T75_U20558_M0 | | |smprd02.niladv.org |04:53:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| |
|norm|4 | | |
0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| |
|norm|3 | | |
0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| |
|norm|1 | | |
0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| |
|norm|28 | | |
0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| |
|norm|1 | | |
0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| |
|norm|1 | | |
0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| |
|norm|1 | | |
0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |
SAPMSSY1 |norm|1 |
| | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| |
|norm|3 | | |
0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| |
|low |1 | | |
0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| |
|norm|5 | | |
0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| |
|norm|15 | | |
0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| |
|norm|3 | | |
0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |
SAPMSSY1 |norm|1 |
| | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| |
|norm|6 | | |
0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| |
|norm|3 | | |
0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| |
|norm|1 | | |
0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| |
|norm|3 | | |
0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| |
|norm|4 | | |
0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| |
|norm|3 | | |
0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| |
|norm|1 | | |
0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| |
|norm|3 | | |
0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| |
|norm|1 | | |
0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| |
|norm|3 | | |
0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| |
|norm|7 | | |
0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| |
|norm|3 | | |
0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| |
|norm|1 | | |
0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| |
|norm|4 | | |
0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| |
|norm|4 | | |
0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| |
|norm|3 | | |
0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| |
|norm|3 | | |
0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| |
|norm|1 | | |
0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| |
|norm|21 | | |
0|
|HTTP_NORMAL |T191_U20619_M0 | | |10.54.36.29 |04:56:48| |
|norm|3 | | |
0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| |
|norm|1 | | |
0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| |
|norm|4 | | |
0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| |
|norm|16 | | |
0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| |
|norm|21 | | |
0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| |
|norm|4 | | |
0|
|HTTP_NORMAL |T208_U20991_M0 | | |10.54.36.11 |05:09:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| |
|norm|6 | | |
0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| |
|norm|3 | | |
0|
|HTTP_NORMAL |T213_U20687_M0 | | |10.54.36.11 |04:59:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T214_U20638_M0 | | |10.54.36.29 |04:57:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| |
|norm|3 | | |
0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| |
|norm|3 | | |
0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| |
|norm|1 | | |
0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| |
|norm|4 | | |
0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| |
|norm|6 | | |
0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| |
|norm|4 | | |
0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| |
|norm|6 | | |
0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| |
|norm|3 | | |
0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| |
|norm|5 | | |
0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| |
|norm|4 | | |
0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| |
|norm|20 | | |
0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| |
|norm|15 | | |
0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| |
|norm|20 | | |
0|
|SYNC_RFC |T256_U20748_M0 | | |smprd02.niladv.org |05:01:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| |
|norm|20 | | |
0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T267_U20801_M0 | | |10.54.36.35 |05:03:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T270_U20802_M0 | | |10.54.36.13 |05:03:30| |
|norm|4 | | |
0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| |
|norm|3 | | |
0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| |
|norm|1 | | |
0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| |
|norm|3 | | |
0|
|SYNC_RFC |T277_U20600_M0 | | |smprd02.niladv.org |04:55:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| |
|norm|3 | | |
0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| |
|norm|6 | | |
0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T287_U20582_M0 | | |10.54.36.12 |04:55:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T288_U20578_M0 | | |10.54.36.29 |04:54:48| |
|norm|13 | | |
0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| |
|norm|3 | | |
0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| |
|norm|4 | | |
0|
|SYNC_RFC |T293_U20809_M0 | | |smprd02.niladv.org |05:03:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T296_U20636_M0 | | |10.54.36.14 |04:57:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| |
|norm|5 | | |
0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T300_U20907_M0 | | |10.50.47.10 |05:06:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T305_U20580_M0 | | |10.50.47.10 |04:54:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T306_U20581_M0 | | |10.54.36.34 |04:54:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T307_U20585_M0 | | |10.54.36.17 |04:55:05| |
|norm|3 | | |
0|
|ASYNC_RFC |T308_U20586_M0 | | |10.54.36.10 |04:55:10| |
|norm|1 | | |
0|
|HTTP_NORMAL |T309_U20587_M0 | | |10.50.47.13 |04:55:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T310_U20592_M0 | | |10.54.36.13 |04:55:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T311_U20594_M0 | | |10.54.36.28 |04:55:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T312_U20595_M0 | | |10.54.36.26 |04:55:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T313_U20596_M0 | | |10.54.36.29 |04:55:38| |
|norm|15 | | |
0|
|SYNC_RFC |T314_U20597_M0 | | |smprd02.niladv.org |04:55:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T315_U20602_M0 | | |10.54.36.19 |04:55:58| |
|norm|19 | | |
0|
|HTTP_NORMAL |T316_U20603_M0 | | |10.54.36.15 |04:56:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T317_U20605_M0 | | |10.54.36.30 |04:56:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T318_U20606_M0 | | |10.54.36.38 |04:56:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T319_U20613_M0 | | |10.54.36.11 |04:56:32| |
|norm|3 | | |
0|
|SYNC_RFC |T320_U20615_M0 | | |smprd02.niladv.org |04:56:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T321_U20616_M0 | | |10.54.36.41 |04:56:40| |
|norm|3 | | |
0|
|SYNC_RFC |T322_U20620_M0 | | |smprd02.niladv.org |04:56:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T323_U20621_M0 | | |10.54.36.32 |04:56:50| |
|norm|12 | | |
0|
|HTTP_NORMAL |T324_U20622_M0 | | |10.54.36.27 |04:56:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T325_U20624_M0 | | |10.54.36.33 |04:56:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T326_U20626_M0 | | |10.54.36.36 |04:57:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T327_U20627_M0 | | |10.54.36.25 |04:57:03| |
|norm|3 | | |
0|
|SYNC_RFC |T328_U20630_M0 | | |smprd02.niladv.org |04:57:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T329_U20632_M0 | | |10.50.47.13 |04:57:13| |
|norm|4 | | |
0|
|HTTP_NORMAL |T330_U20639_M0 | | |10.54.36.29 |04:57:48| |
|norm|3 | | |
0|
|SYNC_RFC |T331_U20641_M0 | | |smprd02.niladv.org |04:57:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T332_U20642_M0 | | |10.50.47.10 |04:57:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T333_U20643_M0 | | |10.54.36.34 |04:57:58| |
|norm|3 | | |
0|
|SYNC_RFC |T334_U20692_M0 | | |smprd02.niladv.org |04:59:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T335_U20645_M0 | | |10.54.36.12 |04:58:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T336_U20648_M0 | | |10.54.36.17 |04:58:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T337_U20653_M0 | | |10.54.36.35 |04:58:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T338_U20717_M0 | | |10.54.36.32 |05:00:50| |
|norm|3 | | |
0|
|SYNC_RFC |T339_U20673_M0 | | |smprd02.niladv.org |04:58:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T340_U20677_M0 | | |10.54.36.38 |04:59:02| |
|norm|4 | | |
0|
|SYNC_RFC |T341_U20690_M0 | | |smprd02.niladv.org |04:59:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T342_U20674_M0 | | |10.54.36.19 |04:58:58| |
|norm|12 | | |
0|
|HTTP_NORMAL |T343_U20691_M0 | | |10.54.36.29 |04:59:49| |
|norm|9 | | |
0|
|HTTP_NORMAL |T344_U20676_M0 | | |10.54.36.15 |04:59:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T345_U20679_M0 | | |10.54.36.30 |04:59:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T346_U20990_M0 | | |10.54.36.13 |05:09:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T347_U20664_M0 | | |10.54.36.26 |04:58:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T348_U20723_M0 | | |10.54.36.12 |05:01:01| |
|norm|4 | | |
0|
|SYNC_RFC |T349_U20669_M0 | | |smprd02.niladv.org |04:58:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T350_U20670_M0 | | |10.54.36.32 |04:58:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T351_U20671_M0 | | |10.54.36.27 |04:58:50| |
|norm|4 | | |
0|
|SYNC_RFC |T352_U20716_M0 | | |smprd02.niladv.org |05:00:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T353_U20694_M0 | | |10.54.36.33 |04:59:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T354_U20697_M0 | | |10.54.36.25 |05:00:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T355_U20698_M0 | | |10.54.36.36 |05:00:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T356_U20700_M0 | | |10.50.47.13 |05:00:12| |
|norm|3 | | |
0|
|SYNC_RFC |T357_U20702_M0 | | |smprd02.niladv.org |05:00:14| |
|norm|1 | | |
0|
|SYNC_RFC |T358_U20703_M0 | | |smprd02.niladv.org |05:00:16| |
|norm|1 | | |
0|
|SYNC_RFC |T359_U20704_M0 | | |smprd02.niladv.org |05:00:18| |
|norm|1 | | |
0|
|SYNC_RFC |T360_U20705_M0 | | |smprd02.niladv.org |05:00:20| |
|norm|1 | | |
0|
|SYNC_RFC |T361_U20707_M0 | | |smprd02.niladv.org |05:00:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T362_U20711_M0 | | |10.54.36.37 |05:00:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T363_U20712_M0 | | |10.54.36.14 |05:00:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T364_U20713_M0 | | |10.54.36.29 |05:00:38| |
|norm|14 | | |
0|
|SYNC_RFC |T365_U20714_M0 | | |smprd02.niladv.org |05:00:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T366_U20718_M0 | | |10.54.36.27 |05:00:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T367_U20720_M0 | | |10.50.47.10 |05:00:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T368_U20721_M0 | | |10.54.36.40 |05:00:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T369_U20722_M0 | | |10.54.36.34 |05:00:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T370_U20724_M0 | | |10.54.36.15 |05:01:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T371_U20725_M0 | | |10.54.36.17 |05:01:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T372_U20727_M0 | | |10.54.36.38 |05:01:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T373_U20738_M0 | | |10.54.36.35 |05:01:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T374_U20729_M0 | | |10.54.36.30 |05:01:04| |
|norm|3 | | |
0|
|SYNC_RFC |T375_U20731_M0 | | |smprd02.niladv.org |05:01:14| |
|norm|1 | | |
0|
|SYNC_RFC |T376_U20732_M0 | | |smprd02.niladv.org |05:01:16| |
|norm|1 | | |
0|
|SYNC_RFC |T377_U20733_M0 | | |smprd02.niladv.org |05:01:18| |
|norm|1 | | |
0|
|SYNC_RFC |T378_U20734_M0 | | |smprd02.niladv.org |05:01:20| |
|norm|1 | | |
0|
|SYNC_RFC |T379_U20736_M0 | | |smprd02.niladv.org |05:01:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T380_U20739_M0 | | |10.54.36.13 |05:01:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T381_U20740_M0 | | |10.54.36.26 |05:01:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T382_U20741_M0 | | |10.54.36.28 |05:01:32| |
|norm|6 | | |
0|
|HTTP_NORMAL |T383_U20742_M0 | | |10.54.36.11 |05:01:32| |
|norm|3 | | |
0|
|SYNC_RFC |T384_U20876_M0 | | |smprd02.niladv.org |05:05:50| |
|norm|1 | | |
0|
|SYNC_RFC |T385_U20745_M0 | | |smprd02.niladv.org |05:01:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T386_U20750_M0 | | |10.54.36.19 |05:01:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T387_U20751_M0 | | |10.54.36.33 |05:01:58| |
|norm|3 | | |
0|
|SYNC_RFC |T388_U20752_M0 | | |smprd02.niladv.org |05:02:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T389_U20754_M0 | | |10.54.36.25 |05:02:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T390_U20755_M0 | | |10.54.36.36 |05:02:04| |
|norm|3 | | |
0|
|SYNC_RFC |T391_U20758_M0 | | |smprd02.niladv.org |05:02:11| |
|norm|1 | | |
0|
|SYNC_RFC |T392_U20760_M0 | | |smprd02.niladv.org |05:02:14| |
|norm|1 | | |
0|
|SYNC_RFC |T393_U20761_M0 | | |smprd02.niladv.org |05:02:16| |
|norm|1 | | |
0|
|SYNC_RFC |T394_U20762_M0 | | |smprd02.niladv.org |05:02:18| |
|norm|1 | | |
0|
|SYNC_RFC |T395_U20763_M0 | | |smprd02.niladv.org |05:02:20| |
|norm|1 | | |
0|
|SYNC_RFC |T396_U20764_M0 | | |smprd02.niladv.org |05:02:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T397_U20769_M0 | | |10.54.36.37 |05:02:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T398_U20877_M0 | | |10.54.36.27 |05:05:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T399_U20774_M0 | | |10.54.36.29 |05:02:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T400_U20775_M0 | | |10.54.36.29 |05:02:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T401_U20776_M0 | | |10.54.36.29 |05:02:48| |
|norm|3 | | |
0|
|SYNC_RFC |T402_U20777_M0 | | |smprd02.niladv.org |05:02:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T403_U20778_M0 | | |10.54.36.27 |05:02:50| |
|norm|5 | | |
0|
|HTTP_NORMAL |T404_U20779_M0 | | |10.54.36.32 |05:02:50| |
|norm|14 | | |
0|
|HTTP_NORMAL |T405_U20781_M0 | | |10.50.47.10 |05:02:53| |
|norm|3 | | |
0|
|SYNC_RFC |T406_U20782_M0 | | |smprd02.niladv.org |05:02:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T407_U20783_M0 | | |10.54.36.34 |05:02:58| |
|norm|3 | | |
0|
|SYNC_RFC |T408_U20784_M0 | | |smprd02.niladv.org |05:03:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T409_U20785_M0 | | |10.54.36.12 |05:03:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T410_U20786_M0 | | |10.54.36.17 |05:03:02| |
|norm|3 | | |
0|
|SYNC_RFC |T411_U21254_M0 | | |smprd02.niladv.org |05:17:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T412_U20789_M0 | | |10.54.36.38 |05:03:04| |
|norm|6 | | |
0|
|HTTP_NORMAL |T413_U20790_M0 | | |10.54.36.30 |05:03:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T414_U20793_M0 | | |10.50.47.13 |05:03:11| |
|norm|5 | | |
0|
|SYNC_RFC |T415_U20795_M0 | | |smprd02.niladv.org |05:03:14| |
|norm|1 | | |
0|
|SYNC_RFC |T416_U20796_M0 | | |smprd02.niladv.org |05:03:16| |
|norm|1 | | |
0|
|SYNC_RFC |T417_U20797_M0 | | |smprd02.niladv.org |05:03:18| |
|norm|1 | | |
0|
|SYNC_RFC |T418_U20798_M0 | | |smprd02.niladv.org |05:03:20| |
|norm|1 | | |
0|
|SYNC_RFC |T419_U20799_M0 | | |smprd02.niladv.org |05:03:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T420_U20803_M0 | | |10.54.36.26 |05:03:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T421_U20804_M0 | | |10.54.36.28 |05:03:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T422_U21176_M0 | | |10.54.36.28 |05:15:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T423_U20807_M0 | | |10.54.36.41 |05:03:39| |
|norm|3 | | |
0|
|SYNC_RFC |T424_U20810_M0 | | |smprd02.niladv.org |05:03:50| |
|norm|1 | | |
0|
|SYNC_RFC |T425_U20812_M0 | | |smprd02.niladv.org |05:03:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T426_U20813_M0 | | |10.54.36.19 |05:03:58| |
|norm|11 | | |
0|
|HTTP_NORMAL |T427_U20814_M0 | | |10.54.36.33 |05:03:58| |
|norm|3 | | |
0|
|SYNC_RFC |T428_U20844_M0 | | |smprd02.niladv.org |05:04:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T429_U20817_M0 | | |10.54.36.25 |05:04:03| |
|norm|4 | | |
0|
|SYNC_RFC |T430_U20818_M0 | | |smprd02.niladv.org |05:04:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T431_U20819_M0 | | |10.54.36.36 |05:04:04| |
|norm|4 | | |
0|
|SYNC_RFC |T432_U20821_M0 | | |smprd02.niladv.org |05:04:10| |
|norm|1 | | |
0|
|SYNC_RFC |T433_U20823_M0 | | |smprd02.niladv.org |05:04:14| |
|norm|1 | | |
0|
|SYNC_RFC |T434_U20824_M0 | | |smprd02.niladv.org |05:04:16| |
|norm|1 | | |
0|
|SYNC_RFC |T435_U20825_M0 | | |smprd02.niladv.org |05:04:18| |
|norm|1 | | |
0|
|SYNC_RFC |T436_U20827_M0 | | |smprd02.niladv.org |05:04:20| |
|norm|1 | | |
0|
|SYNC_RFC |T437_U20828_M0 | | |smprd02.niladv.org |05:04:22| |
|norm|1 | | |
0|
|SYNC_RFC |T438_U20961_M0 | | |smprd02.niladv.org |05:08:46| |
|norm|1 | | |
0|
|SYNC_RFC |T439_U20856_M0 | | |smprd02.niladv.org |05:05:10| |
|norm|1 | | |
0|
|SYNC_RFC |T440_U20854_M0 | | |smprd02.niladv.org |05:05:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T441_U20847_M0 | | |10.50.47.10 |05:04:54| |
|norm|4 | | |
0|
|HTTP_NORMAL |T442_U20851_M0 | | |10.54.36.17 |05:05:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T443_U20850_M0 | | |10.54.36.12 |05:05:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T444_U20853_M0 | | |10.54.36.15 |05:05:03| |
|norm|5 | | |
0|
|SYNC_RFC |T445_U20936_M0 | | |smprd02.niladv.org |05:07:45| |
|norm|1 | | |
0|
|HTTP_NORMAL |T446_U20840_M0 | | |10.54.36.37 |05:04:33| |
|norm|5 | | |
0|
|HTTP_NORMAL |T447_U20910_M0 | | |10.54.36.12 |05:07:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T448_U20845_M0 | | |10.54.36.29 |05:04:49| |
|norm|3 | | |
0|
|SYNC_RFC |T449_U20859_M0 | | |smprd02.niladv.org |05:05:14| |
|norm|1 | | |
0|
|SYNC_RFC |T450_U20860_M0 | | |smprd02.niladv.org |05:05:16| |
|norm|1 | | |
0|
|SYNC_RFC |T451_U20861_M0 | | |smprd02.niladv.org |05:05:18| |
|norm|1 | | |
0|
|SYNC_RFC |T452_U20863_M0 | | |smprd02.niladv.org |05:05:20| |
|norm|1 | | |
0|
|SYNC_RFC |T453_U20864_M0 | | |smprd02.niladv.org |05:05:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T454_U20867_M0 | | |10.54.36.35 |05:05:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T455_U20868_M0 | | |10.54.36.13 |05:05:31| |
|norm|6 | | |
0|
|HTTP_NORMAL |T456_U20869_M0 | | |10.54.36.26 |05:05:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T457_U20870_M0 | | |10.54.36.28 |05:05:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T458_U20872_M0 | | |10.54.36.29 |05:05:38| |
|norm|13 | | |
0|
|HTTP_NORMAL |T459_U20873_M0 | | |10.54.36.41 |05:05:40| |
|norm|3 | | |
0|
|SYNC_RFC |T460_U20874_M0 | | |smprd02.niladv.org |05:05:40| |
|norm|1 | | |
0|
|HTTP_NORMAL |T461_U20878_M0 | | |10.54.36.32 |05:05:51| |
|norm|20 | | |
0|
|HTTP_NORMAL |T462_U20880_M0 | | |10.54.36.19 |05:05:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T463_U20881_M0 | | |10.54.36.33 |05:05:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T464_U20911_M0 | | |10.54.36.17 |05:07:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T465_U20884_M0 | | |10.54.36.30 |05:06:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T466_U20886_M0 | | |10.54.36.38 |05:06:03| |
|norm|6 | | |
0|
|SYNC_RFC |T467_U20887_M0 | | |smprd02.niladv.org |05:06:05| |
|norm|1 | | |
0|
|SYNC_RFC |T468_U20888_M0 | | |smprd02.niladv.org |05:06:05| |
|norm|1 | | |
0|
|SYNC_RFC |T469_U20890_M0 | | |smprd02.niladv.org |05:06:14| |
|norm|1 | | |
0|
|SYNC_RFC |T470_U20891_M0 | | |smprd02.niladv.org |05:06:15| |
|norm|1 | | |
0|
|SYNC_RFC |T471_U20892_M0 | | |smprd02.niladv.org |05:06:17| |
|norm|1 | | |
0|
|SYNC_RFC |T472_U20893_M0 | | |smprd02.niladv.org |05:06:18| |
|norm|1 | | |
0|
|SYNC_RFC |T473_U20896_M0 | | |smprd02.niladv.org |05:06:20| |
|norm|1 | | |
0|
|SYNC_RFC |T474_U20897_M0 | | |smprd02.niladv.org |05:06:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T475_U20899_M0 | | |10.54.36.11 |05:06:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T476_U20901_M0 | | |10.54.36.37 |05:06:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T477_U20902_M0 | | |10.54.36.14 |05:06:37| |
|norm|3 | | |
0|
|SYNC_RFC |T478_U20903_M0 | | |smprd02.niladv.org |05:06:40| |
|norm|1 | | |
0|
|SYNC_RFC |T479_U20908_M0 | | |smprd02.niladv.org |05:06:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T480_U20909_M0 | | |10.54.36.34 |05:06:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T481_U20912_M0 | | |10.54.36.25 |05:07:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T482_U20914_M0 | | |10.54.36.36 |05:07:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T483_U20929_M0 | | |10.54.36.35 |05:07:29| |
|norm|3 | | |
0|
|SYNC_RFC |T484_U20916_M0 | | |smprd02.niladv.org |05:07:05| |
|norm|1 | | |
0|
|SYNC_RFC |T485_U20917_M0 | | |smprd02.niladv.org |05:07:05| |
|norm|1 | | |
0|
|SYNC_RFC |T486_U20918_M0 | | |smprd02.niladv.org |05:07:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T487_U20919_M0 | | |10.50.47.13 |05:07:12| |
|norm|3 | | |
0|
|SYNC_RFC |T488_U20921_M0 | | |smprd02.niladv.org |05:07:15| |
|norm|1 | | |
0|
|SYNC_RFC |T489_U20922_M0 | | |smprd02.niladv.org |05:07:15| |
|norm|1 | | |
0|
|SYNC_RFC |T490_U20923_M0 | | |smprd02.niladv.org |05:07:17| |
|norm|1 | | |
0|
|SYNC_RFC |T491_U20924_M0 | | |smprd02.niladv.org |05:07:19| |
|norm|1 | | |
0|
|SYNC_RFC |T492_U20926_M0 | | |smprd02.niladv.org |05:07:21| |
|norm|1 | | |
0|
|SYNC_RFC |T493_U20927_M0 | | |smprd02.niladv.org |05:07:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T494_U20930_M0 | | |10.54.36.13 |05:07:31| |
|norm|4 | | |
0|
|HTTP_NORMAL |T495_U20931_M0 | | |10.54.36.26 |05:07:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T496_U20933_M0 | | |10.54.36.28 |05:07:34| |
|norm|4 | | |
0|
|HTTP_NORMAL |T497_U20937_M0 | | |10.54.36.29 |05:07:48| |
|norm|11 | | |
0|
|HTTP_NORMAL |T498_U20938_M0 | | |10.54.36.29 |05:07:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T499_U20939_M0 | | |10.54.36.29 |05:07:49| |
|norm|3 | | |
0|
|SYNC_RFC |T500_U20941_M0 | | |smprd02.niladv.org |05:07:59| |
|norm|1 | | |
0|
|HTTP_NORMAL |T501_U20942_M0 | | |10.54.36.15 |05:08:01| |
|norm|5 | | |
0|
|SYNC_RFC |T502_U20946_M0 | | |smprd02.niladv.org |05:08:06| |
|norm|1 | | |
0|
|SYNC_RFC |T503_U20947_M0 | | |smprd02.niladv.org |05:08:06| |
|norm|1 | | |
0|
|SYNC_RFC |T504_U20949_M0 | | |smprd02.niladv.org |05:08:15| |
|norm|1 | | |
0|
|SYNC_RFC |T505_U20950_M0 | | |smprd02.niladv.org |05:08:17| |
|norm|1 | | |
0|
|SYNC_RFC |T506_U20951_M0 | | |smprd02.niladv.org |05:08:19| |
|norm|1 | | |
0|
|SYNC_RFC |T507_U20952_M0 | | |smprd02.niladv.org |05:08:21| |
|norm|1 | | |
0|
|SYNC_RFC |T508_U20953_M0 | | |smprd02.niladv.org |05:08:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T509_U20958_M0 | | |10.54.36.41 |05:08:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T510_U20962_M0 | | |10.54.36.32 |05:08:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T511_U20963_M0 | | |10.54.36.27 |05:08:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T512_U20965_M0 | | |10.54.36.33 |05:08:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T513_U20966_M0 | | |10.54.36.19 |05:08:58| |
|norm|22 | | |
0|
|SYNC_RFC |T514_U20967_M0 | | |smprd02.niladv.org |05:09:00| |
|norm|1 | | |
0|
|HTTP_NORMAL |T515_U20968_M0 | | |10.54.36.12 |05:09:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T516_U20969_M0 | | |10.54.36.30 |05:09:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T517_U20971_M0 | | |10.54.36.38 |05:09:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T518_U20974_M0 | | |10.54.36.17 |05:09:05| |
|norm|3 | | |
0|
|SYNC_RFC |T519_U20975_M0 | | |smprd02.niladv.org |05:09:06| |
|norm|1 | | |
0|
|SYNC_RFC |T520_U20976_M0 | | |smprd02.niladv.org |05:09:06| |
|norm|1 | | |
0|
|SYNC_RFC |T521_U20977_M0 | | |smprd02.niladv.org |05:09:08| |
|norm|1 | | |
0|
|SYNC_RFC |T522_U20997_M0 | | |smprd02.niladv.org |05:09:46| |
|norm|1 | | |
0|
|HTTP_NORMAL |T523_U20998_M0 | | |10.54.36.29 |05:09:48| |
|norm|6 | | |
0|
|HTTP_NORMAL |T524_U21071_M0 | | |10.50.47.10 |05:11:53| |
|norm|3 | | |
0|
|SYNC_RFC |T525_U21069_M0 | | |smprd02.niladv.org |05:11:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T526_U21370_M0 | | |10.54.36.35 |05:21:28| |
|norm|2 | | |
0|
|SYNC_RFC |T527_U20984_M0 | | |smprd02.niladv.org |05:09:15| |
|norm|1 | | |
0|
|SYNC_RFC |T528_U20985_M0 | | |smprd02.niladv.org |05:09:17| |
|norm|1 | | |
0|
|SYNC_RFC |T529_U20986_M0 | | |smprd02.niladv.org |05:09:19| |
|norm|1 | | |
0|
|SYNC_RFC |T530_U20987_M0 | | |smprd02.niladv.org |05:09:21| |
|norm|1 | | |
0|
|SYNC_RFC |T531_U20988_M0 | | |smprd02.niladv.org |05:09:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T532_U20992_M0 | | |10.54.36.37 |05:09:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T533_U20994_M0 | | |10.54.36.26 |05:09:33| |
|norm|3 | | |
0|
|SYNC_RFC |T534_U21096_M0 | | |smprd02.niladv.org |05:12:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T535_U21000_M0 | | |10.50.47.10 |05:09:53| |
|norm|4 | | |
0|
|HTTP_NORMAL |T536_U21001_M0 | | |10.54.36.34 |05:09:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T537_U21002_M0 | | |10.54.36.25 |05:10:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T538_U21004_M0 | | |10.54.36.36 |05:10:04| |
|norm|3 | | |
0|
|SYNC_RFC |T539_U21007_M0 | | |smprd02.niladv.org |05:10:06| |
|norm|1 | | |
0|
|SYNC_RFC |T540_U21008_M0 | | |smprd02.niladv.org |05:10:08| |
|norm|1 | | |
0|
|SYNC_RFC |T541_U21009_M0 | | |smprd02.niladv.org |05:10:08| |
|norm|1 | | |
0|
|SYNC_RFC |T542_U21010_M0 | | |smprd02.niladv.org |05:10:09| |
|norm|1 | | |
0|
|SYNC_RFC |T543_U21012_M0 | | |smprd02.niladv.org |05:10:15| |
|norm|1 | | |
0|
|SYNC_RFC |T544_U21013_M0 | | |smprd02.niladv.org |05:10:17| |
|norm|1 | | |
0|
|SYNC_RFC |T545_U21014_M0 | | |smprd02.niladv.org |05:10:19| |
|norm|1 | | |
0|
|SYNC_RFC |T546_U21015_M0 | | |smprd02.niladv.org |05:10:21| |
|norm|1 | | |
0|
|SYNC_RFC |T547_U21016_M0 | | |smprd02.niladv.org |05:10:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T548_U21020_M0 | | |10.54.36.35 |05:10:27| |
|norm|3 | | |
0|
|SYNC_RFC |T549_U21103_M0 | | |smprd02.niladv.org |05:13:01| |
|norm|1 | | |
0|
|SYNC_RFC |T550_U21049_M0 | | |smprd02.niladv.org |05:11:08| |
|norm|1 | | |
0|
|HTTP_NORMAL |T551_U21047_M0 | | |10.54.36.38 |05:11:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T552_U21048_M0 | | |10.54.36.17 |05:11:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T553_U21041_M0 | | |10.54.36.15 |05:11:00| |
|norm|3 | | |
0|
|HTTP_NORMAL |T554_U21040_M0 | | |10.54.36.33 |05:10:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T555_U21042_M0 | | |10.54.36.12 |05:11:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T556_U21043_M0 | | |10.54.36.30 |05:11:03| |
|norm|3 | | |
0|
|SYNC_RFC |T557_U21129_M0 | | |smprd02.niladv.org |05:13:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T558_U21031_M0 | | |10.54.36.29 |05:10:38| |
|norm|12 | | |
0|
|SYNC_RFC |T559_U21035_M0 | | |smprd02.niladv.org |05:10:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T560_U21036_M0 | | |10.54.36.32 |05:10:50| |
|norm|10 | | |
0|
|HTTP_NORMAL |T561_U21037_M0 | | |10.54.36.27 |05:10:50| |
|norm|4 | | |
0|
|SYNC_RFC |T562_U21050_M0 | | |smprd02.niladv.org |05:11:09| |
|norm|1 | | |
0|
|SYNC_RFC |T563_U21374_M0 | | |smprd02.niladv.org |05:21:49| |
|norm|1 | | |
0|
|SYNC_RFC |T564_U21053_M0 | | |smprd02.niladv.org |05:11:15| |
|norm|1 | | |
0|
|SYNC_RFC |T565_U21054_M0 | | |smprd02.niladv.org |05:11:17| |
|norm|1 | | |
0|
|SYNC_RFC |T566_U21055_M0 | | |smprd02.niladv.org |05:11:19| |
|norm|1 | | |
0|
|SYNC_RFC |T567_U21056_M0 | | |smprd02.niladv.org |05:11:21| |
|norm|1 | | |
0|
|SYNC_RFC |T568_U21057_M0 | | |smprd02.niladv.org |05:11:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T569_U21061_M0 | | |10.54.36.13 |05:11:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T570_U21062_M0 | | |10.54.36.11 |05:11:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T571_U21063_M0 | | |10.54.36.37 |05:11:33| |
|norm|3 | | |
0|
|SYNC_RFC |T572_U21442_M0 | | |smprd02.niladv.org |05:23:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T573_U21066_M0 | | |10.54.36.14 |05:11:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T574_U21067_M0 | | |10.54.36.41 |05:11:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T575_U21072_M0 | | |10.54.36.19 |05:11:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T576_U21073_M0 | | |10.54.36.34 |05:11:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T577_U21097_M0 | | |10.54.36.29 |05:12:49| |
|norm|3 | | |
0|
|SYNC_RFC |T578_U21075_M0 | | |smprd02.niladv.org |05:12:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T579_U21076_M0 | | |10.54.36.25 |05:12:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T580_U21079_M0 | | |10.54.36.36 |05:12:05| |
|norm|3 | | |
0|
|SYNC_RFC |T581_U21080_M0 | | |smprd02.niladv.org |05:12:08| |
|norm|1 | | |
0|
|SYNC_RFC |T582_U21081_M0 | | |smprd02.niladv.org |05:12:10| |
|norm|1 | | |
0|
|SYNC_RFC |T583_U21082_M0 | | |smprd02.niladv.org |05:12:11| |
|norm|1 | | |
0|
|SYNC_RFC |T584_U21084_M0 | | |smprd02.niladv.org |05:12:15| |
|norm|1 | | |
0|
|SYNC_RFC |T585_U21085_M0 | | |smprd02.niladv.org |05:12:17| |
|norm|1 | | |
0|
|SYNC_RFC |T586_U21086_M0 | | |smprd02.niladv.org |05:12:19| |
|norm|1 | | |
0|
|SYNC_RFC |T587_U21088_M0 | | |smprd02.niladv.org |05:12:21| |
|norm|1 | | |
0|
|SYNC_RFC |T588_U21089_M0 | | |smprd02.niladv.org |05:12:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T589_U21092_M0 | | |10.54.36.35 |05:12:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T590_U21094_M0 | | |10.54.36.28 |05:12:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T591_U21098_M0 | | |10.54.36.29 |05:12:49| |
|norm|3 | | |
0|
|HTTP_NORMAL |T592_U21099_M0 | | |10.54.36.29 |05:12:49| |
|norm|3 | | |
0|
|SYNC_RFC |T593_U21101_M0 | | |smprd02.niladv.org |05:12:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T594_U21102_M0 | | |10.54.36.33 |05:12:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T595_U21104_M0 | | |10.54.36.15 |05:13:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T596_U21105_M0 | | |10.54.36.12 |05:13:03| |
|norm|6 | | |
0|
|HTTP_NORMAL |T597_U21120_M0 | | |10.54.36.11 |05:13:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T598_U21108_M0 | | |10.54.36.30 |05:13:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T599_U21109_M0 | | |10.54.36.17 |05:13:05| |
|norm|3 | | |
0|
|SYNC_RFC |T600_U21110_M0 | | |smprd02.niladv.org |05:13:08| |
|norm|1 | | |
0|
|SYNC_RFC |T601_U21183_M0 | | |smprd02.niladv.org |05:15:48| |
|norm|1 | | |
0|
|SYNC_RFC |T602_U21113_M0 | | |smprd02.niladv.org |05:13:15| |
|norm|1 | | |
0|
|SYNC_RFC |T603_U21114_M0 | | |smprd02.niladv.org |05:13:17| |
|norm|1 | | |
0|
|SYNC_RFC |T604_U21115_M0 | | |smprd02.niladv.org |05:13:19| |
|norm|1 | | |
0|
|SYNC_RFC |T605_U21117_M0 | | |smprd02.niladv.org |05:13:21| |
|norm|1 | | |
0|
|SYNC_RFC |T606_U21118_M0 | | |smprd02.niladv.org |05:13:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T607_U21121_M0 | | |10.54.36.13 |05:13:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T608_U21122_M0 | | |10.54.36.37 |05:13:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T609_U21473_M0 | | |10.54.36.41 |05:24:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T610_U21125_M0 | | |10.54.36.14 |05:13:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T611_U21375_M0 | | |10.50.47.10 |05:21:53| |
|norm|2 | | |
0|
|HTTP_NORMAL |T612_U21130_M0 | | |10.54.36.32 |05:13:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T613_U21131_M0 | | |10.54.36.27 |05:13:50| |
|norm|4 | | |
0|
|SYNC_RFC |T614_U21133_M0 | | |smprd02.niladv.org |05:13:54| |
|norm|1 | | |
0|
|SYNC_RFC |T615_U21134_M0 | | |smprd02.niladv.org |05:13:56| |
|norm|1 | | |
0|
|HTTP_NORMAL |T616_U21135_M0 | | |10.54.36.19 |05:13:58| |
|norm|10 | | |
0|
|HTTP_NORMAL |T617_U21136_M0 | | |10.54.36.34 |05:13:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T618_U21138_M0 | | |10.54.36.38 |05:14:03| |
|norm|5 | | |
0|
|SYNC_RFC |T619_U21141_M0 | | |smprd02.niladv.org |05:14:08| |
|norm|1 | | |
0|
|SYNC_RFC |T620_U21142_M0 | | |smprd02.niladv.org |05:14:10| |
|norm|1 | | |
0|
|SYNC_RFC |T621_U21144_M0 | | |smprd02.niladv.org |05:14:15| |
|norm|1 | | |
0|
|SYNC_RFC |T622_U21145_M0 | | |smprd02.niladv.org |05:14:17| |
|norm|1 | | |
0|
|SYNC_RFC |T623_U21146_M0 | | |smprd02.niladv.org |05:14:19| |
|norm|1 | | |
0|
|SYNC_RFC |T624_U21147_M0 | | |smprd02.niladv.org |05:14:21| |
|norm|1 | | |
0|
|SYNC_RFC |T625_U21148_M0 | | |smprd02.niladv.org |05:14:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T626_U21152_M0 | | |10.54.36.35 |05:14:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T627_U21157_M0 | | |10.54.36.29 |05:14:50| |
|norm|6 | | |
0|
|HTTP_NORMAL |T628_U21159_M0 | | |10.50.47.10 |05:14:53| |
|norm|3 | | |
0|
|SYNC_RFC |T629_U21160_M0 | | |smprd02.niladv.org |05:14:56| |
|norm|1 | | |
0|
|HTTP_NORMAL |T630_U21161_M0 | | |10.54.36.36 |05:15:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T631_U21162_M0 | | |10.54.36.25 |05:15:03| |
|norm|2 | | |
0|
|SYNC_RFC |T632_U21166_M0 | | |smprd02.niladv.org |05:15:08| |
|norm|1 | | |
0|
|HTTP_NORMAL |T633_U21167_M0 | | |10.50.47.13 |05:15:12| |
|norm|5 | | |
0|
|SYNC_RFC |T634_U21169_M0 | | |smprd02.niladv.org |05:15:15| |
|norm|1 | | |
0|
|SYNC_RFC |T635_U21170_M0 | | |smprd02.niladv.org |05:15:17| |
|norm|1 | | |
0|
|SYNC_RFC |T636_U21171_M0 | | |smprd02.niladv.org |05:15:19| |
|norm|1 | | |
0|
|SYNC_RFC |T637_U21172_M0 | | |smprd02.niladv.org |05:15:21| |
|norm|1 | | |
0|
|SYNC_RFC |T638_U21173_M0 | | |smprd02.niladv.org |05:15:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T639_U21178_M0 | | |10.54.36.29 |05:15:39| |
|norm|10 | | |
0|
|HTTP_NORMAL |T640_U21310_M0 | | |10.54.36.35 |05:19:27| |
|norm|2 | | |
0|
|SYNC_RFC |T641_U21180_M0 | | |smprd02.niladv.org |05:15:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T642_U21184_M0 | | |10.54.36.27 |05:15:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T643_U21185_M0 | | |10.54.36.32 |05:15:51| |
|norm|15 | | |
0|
|SYNC_RFC |T644_U21187_M0 | | |smprd02.niladv.org |05:15:56| |
|norm|1 | | |
0|
|HTTP_NORMAL |T645_U21188_M0 | | |10.54.36.33 |05:15:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T646_U21189_M0 | | |10.54.36.34 |05:15:59| |
|norm|2 | | |
0|
|HTTP_NORMAL |T647_U21190_M0 | | |10.54.36.15 |05:16:00| |
|norm|4 | | |
0|
|HTTP_NORMAL |T648_U21191_M0 | | |10.54.36.12 |05:16:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T649_U21192_M0 | | |10.54.36.17 |05:16:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T650_U21193_M0 | | |10.54.36.30 |05:16:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T651_U21197_M0 | | |10.54.36.38 |05:16:04| |
|norm|5 | | |
0|
|SYNC_RFC |T652_U21198_M0 | | |smprd02.niladv.org |05:16:08| |
|norm|1 | | |
0|
|SYNC_RFC |T653_U21199_M0 | | |smprd02.niladv.org |05:16:10| |
|norm|1 | | |
0|
|SYNC_RFC |T654_U21201_M0 | | |smprd02.niladv.org |05:16:15| |
|norm|1 | | |
0|
|SYNC_RFC |T655_U21202_M0 | | |smprd02.niladv.org |05:16:17| |
|norm|1 | | |
0|
|SYNC_RFC |T656_U21203_M0 | | |smprd02.niladv.org |05:16:19| |
|norm|1 | | |
0|
|SYNC_RFC |T657_U21204_M0 | | |smprd02.niladv.org |05:16:21| |
|norm|1 | | |
0|
|SYNC_RFC |T658_U21205_M0 | | |smprd02.niladv.org |05:16:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T659_U21209_M0 | | |10.54.36.35 |05:16:29| |
|norm|2 | | |
0|
|HTTP_NORMAL |T660_U21210_M0 | | |10.54.36.13 |05:16:30| |
|norm|5 | | |
0|
|HTTP_NORMAL |T661_U21211_M0 | | |10.54.36.37 |05:16:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T662_U21212_M0 | | |10.54.36.26 |05:16:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T663_U21213_M0 | | |10.54.36.11 |05:16:32| |
|norm|2 | | |
0|
|SYNC_RFC |T664_U21241_M0 | | |smprd02.niladv.org |05:17:15| |
|norm|1 | | |
0|
|SYNC_RFC |T665_U21242_M0 | | |smprd02.niladv.org |05:17:17| |
|norm|1 | | |
0|
|SYNC_RFC |T666_U21238_M0 | | |smprd02.niladv.org |05:17:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T667_U21239_M0 | | |10.50.47.13 |05:17:13| |
|norm|2 | | |
0|
|HTTP_NORMAL |T668_U21233_M0 | | |10.54.36.36 |05:17:03| |
|norm|3 | | |
0|
|SYNC_RFC |T669_U21231_M0 | | |smprd02.niladv.org |05:16:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T670_U21234_M0 | | |10.54.36.25 |05:17:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T671_U21232_M0 | | |10.54.36.19 |05:16:57| |
|norm|5 | | |
0|
|HTTP_NORMAL |T672_U21223_M0 | | |10.54.36.14 |05:16:35| |
|norm|3 | | |
0|
|SYNC_RFC |T673_U21224_M0 | | |smprd02.niladv.org |05:16:41| |
|norm|1 | | |
0|
|SYNC_RFC |T674_U21228_M0 | | |smprd02.niladv.org |05:16:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T675_U21230_M0 | | |10.50.47.10 |05:16:54| |
|norm|3 | | |
0|
|SYNC_RFC |T676_U21243_M0 | | |smprd02.niladv.org |05:17:19| |
|norm|1 | | |
0|
|SYNC_RFC |T677_U21244_M0 | | |smprd02.niladv.org |05:17:21| |
|norm|1 | | |
0|
|SYNC_RFC |T678_U21245_M0 | | |smprd02.niladv.org |05:17:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T679_U21249_M0 | | |10.54.36.28 |05:17:33| |
|norm|5 | | |
0|
|HTTP_NORMAL |T680_U21251_M0 | | |10.54.36.41 |05:17:40| |
|norm|2 | | |
0|
|HTTP_NORMAL |T681_U21255_M0 | | |10.54.36.29 |05:17:49| |
|norm|5 | | |
0|
|HTTP_NORMAL |T682_U21256_M0 | | |10.54.36.29 |05:17:49| |
|norm|2 | | |
0|
|HTTP_NORMAL |T683_U21257_M0 | | |10.54.36.29 |05:17:49| |
|norm|2 | | |
0|
|SYNC_RFC |T684_U21259_M0 | | |smprd02.niladv.org |05:17:54| |
|norm|1 | | |
0|
|SYNC_RFC |T685_U21260_M0 | | |smprd02.niladv.org |05:17:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T686_U21261_M0 | | |10.54.36.33 |05:17:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T687_U21262_M0 | | |10.54.36.12 |05:18:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T688_U21263_M0 | | |10.54.36.15 |05:18:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T689_U21264_M0 | | |10.54.36.17 |05:18:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T690_U21266_M0 | | |10.54.36.30 |05:18:04| |
|norm|2 | | |
0|
|HTTP_NORMAL |T691_U21269_M0 | | |10.54.36.38 |05:18:04| |
|norm|4 | | |
0|
|SYNC_RFC |T692_U21270_M0 | | |smprd02.niladv.org |05:18:10| |
|norm|1 | | |
0|
|SYNC_RFC |T693_U21272_M0 | | |smprd02.niladv.org |05:18:15| |
|norm|1 | | |
0|
|SYNC_RFC |T694_U21273_M0 | | |smprd02.niladv.org |05:18:17| |
|norm|1 | | |
0|
|SYNC_RFC |T695_U21274_M0 | | |smprd02.niladv.org |05:18:19| |
|norm|1 | | |
0|
|SYNC_RFC |T696_U21275_M0 | | |smprd02.niladv.org |05:18:21| |
|norm|1 | | |
0|
|SYNC_RFC |T697_U21276_M0 | | |smprd02.niladv.org |05:18:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T698_U21280_M0 | | |10.54.36.13 |05:18:30| |
|norm|4 | | |
0|
|HTTP_NORMAL |T699_U21281_M0 | | |10.54.36.37 |05:18:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T700_U21282_M0 | | |10.54.36.26 |05:18:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T701_U21283_M0 | | |10.54.36.11 |05:18:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T702_U21285_M0 | | |10.54.36.14 |05:18:35| |
|norm|2 | | |
0|
|HTTP_NORMAL |T703_U21289_M0 | | |10.54.36.27 |05:18:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T704_U21290_M0 | | |10.54.36.32 |05:18:51| |
|norm|2 | | |
0|
|HTTP_NORMAL |T705_U21292_M0 | | |10.50.47.10 |05:18:54| |
|norm|2 | | |
0|
|SYNC_RFC |T706_U21293_M0 | | |smprd02.niladv.org |05:18:54| |
|norm|1 | | |
0|
|SYNC_RFC |T707_U21294_M0 | | |smprd02.niladv.org |05:18:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T708_U21295_M0 | | |10.54.36.34 |05:18:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T709_U21296_M0 | | |10.54.36.19 |05:18:58| |
|norm|14 | | |
0|
|HTTP_NORMAL |T710_U21297_M0 | | |10.54.36.25 |05:19:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T711_U21301_M0 | | |10.50.47.13 |05:19:13| |
|norm|3 | | |
0|
|SYNC_RFC |T712_U21303_M0 | | |smprd02.niladv.org |05:19:15| |
|norm|1 | | |
0|
|SYNC_RFC |T713_U21304_M0 | | |smprd02.niladv.org |05:19:17| |
|norm|1 | | |
0|
|SYNC_RFC |T714_U21305_M0 | | |smprd02.niladv.org |05:19:19| |
|norm|1 | | |
0|
|SYNC_RFC |T715_U21306_M0 | | |smprd02.niladv.org |05:19:21| |
|norm|1 | | |
0|
|SYNC_RFC |T716_U21308_M0 | | |smprd02.niladv.org |05:19:23| |
|norm|1 | | |
0|
|SYNC_RFC |T717_U21315_M0 | | |smprd02.niladv.org |05:19:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T718_U21317_M0 | | |10.54.36.12 |05:20:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T719_U21318_M0 | | |10.54.36.36 |05:20:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T720_U21322_M0 | | |10.54.36.17 |05:20:05| |
|norm|2 | | |
0|
|SYNC_RFC |T721_U21323_M0 | | |smprd02.niladv.org |05:20:10| |
|norm|1 | | |
0|
|SYNC_RFC |T722_U21325_M0 | | |smprd02.niladv.org |05:20:15| |
|norm|1 | | |
0|
|SYNC_RFC |T723_U21326_M0 | | |smprd02.niladv.org |05:20:17| |
|norm|1 | | |
0|
|SYNC_RFC |T724_U21327_M0 | | |smprd02.niladv.org |05:20:19| |
|norm|1 | | |
0|
|SYNC_RFC |T725_U21328_M0 | | |smprd02.niladv.org |05:20:21| |
|norm|1 | | |
0|
|SYNC_RFC |T726_U21330_M0 | | |smprd02.niladv.org |05:20:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T727_U21333_M0 | | |10.54.36.13 |05:20:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T728_U21334_M0 | | |10.54.36.11 |05:20:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T729_U21335_M0 | | |10.54.36.37 |05:20:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T730_U21336_M0 | | |10.54.36.28 |05:20:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T731_U21337_M0 | | |10.54.36.26 |05:20:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T732_U21339_M0 | | |10.54.36.14 |05:20:37| |
|norm|2 | | |
0|
|HTTP_NORMAL |T733_U21340_M0 | | |10.54.36.29 |05:20:39| |
|norm|9 | | |
0|
|HTTP_NORMAL |T734_U21341_M0 | | |10.54.36.41 |05:20:40| |
|norm|2 | | |
0|
|SYNC_RFC |T735_U21342_M0 | | |smprd02.niladv.org |05:20:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T736_U21346_M0 | | |10.54.36.29 |05:20:48| |
|norm|4 | | |
0|
|SYNC_RFC |T737_U21347_M0 | | |smprd02.niladv.org |05:20:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T738_U21348_M0 | | |10.54.36.27 |05:20:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T739_U21349_M0 | | |10.54.36.32 |05:20:51| |
|norm|14 | | |
0|
|HTTP_NORMAL |T740_U21351_M0 | | |10.54.36.40 |05:20:56| |
|norm|2 | | |
0|
|HTTP_NORMAL |T741_U21352_M0 | | |10.54.36.33 |05:20:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T742_U21353_M0 | | |10.54.36.34 |05:20:58| |
|norm|5 | | |
0|
|HTTP_NORMAL |T743_U21354_M0 | | |10.54.36.15 |05:21:00| |
|norm|2 | | |
0|
|HTTP_NORMAL |T744_U21355_M0 | | |10.54.36.25 |05:21:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T745_U21356_M0 | | |10.54.36.38 |05:21:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T746_U21358_M0 | | |10.54.36.30 |05:21:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T747_U21361_M0 | | |10.50.47.13 |05:21:13| |
|norm|3 | | |
0|
|SYNC_RFC |T748_U21363_M0 | | |smprd02.niladv.org |05:21:15| |
|norm|1 | | |
0|
|SYNC_RFC |T749_U21364_M0 | | |smprd02.niladv.org |05:21:17| |
|norm|1 | | |
0|
|SYNC_RFC |T750_U21365_M0 | | |smprd02.niladv.org |05:21:19| |
|norm|1 | | |
0|
|SYNC_RFC |T751_U21366_M0 | | |smprd02.niladv.org |05:21:21| |
|norm|1 | | |
0|
|SYNC_RFC |T752_U21368_M0 | | |smprd02.niladv.org |05:21:23| |
|norm|1 | | |
0|
|SYNC_RFC |T753_U21372_M0 | | |smprd02.niladv.org |05:21:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T754_U21377_M0 | | |10.54.36.19 |05:21:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T755_U21443_M0 | | |10.54.36.32 |05:23:50| |
|norm|13 | | |
0|
|SYNC_RFC |T756_U21379_M0 | | |smprd02.niladv.org |05:22:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T757_U21380_M0 | | |10.54.36.12 |05:22:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T758_U21382_M0 | | |10.54.36.36 |05:22:04| |
|norm|2 | | |
0|
|HTTP_NORMAL |T759_U21384_M0 | | |10.54.36.17 |05:22:05| |
|norm|2 | | |
0|
|SYNC_RFC |T760_U21385_M0 | | |smprd02.niladv.org |05:22:10| |
|norm|1 | | |
0|
|SYNC_RFC |T761_U21386_M0 | | |smprd02.niladv.org |05:22:11| |
|norm|1 | | |
0|
|SYNC_RFC |T762_U21388_M0 | | |smprd02.niladv.org |05:22:16| |
|norm|1 | | |
0|
|SYNC_RFC |T763_U21389_M0 | | |smprd02.niladv.org |05:22:17| |
|norm|1 | | |
0|
|SYNC_RFC |T764_U21390_M0 | | |smprd02.niladv.org |05:22:20| |
|norm|1 | | |
0|
|SYNC_RFC |T765_U21392_M0 | | |smprd02.niladv.org |05:22:22| |
|norm|1 | | |
0|
|SYNC_RFC |T766_U21394_M0 | | |smprd02.niladv.org |05:22:24| |
|norm|1 | | |0|
|HTTP_NORMAL |T767_U21429_M0 | | |10.50.47.13 |05:23:13| ||norm|1 | | |0|
|SYNC_RFC |T768_U21424_M0 | | |smprd02.niladv.org |05:23:01| ||norm|1 | | |0|
|HTTP_NORMAL |T769_U21426_M0 | | |10.54.36.38 |05:23:04| ||norm|1 | | |0|
|HTTP_NORMAL |T770_U21420_M0 | | |10.54.36.33 |05:22:58| ||norm|1 | | |0|
|HTTP_NORMAL |T771_U21421_M0 | | |10.54.36.34 |05:22:59| ||norm|1 | | |0|
|SYNC_RFC |T772_U21419_M0 | | |smprd02.niladv.org |05:22:55| ||norm|1 | | |0|
|HTTP_NORMAL |T773_U21423_M0 | | |10.54.36.15 |05:23:00| ||norm|3 | | |0|
|HTTP_NORMAL |T774_U21428_M0 | | |10.54.36.30 |05:23:05| ||norm|2 | | |0|
|HTTP_NORMAL |T775_U21404_M0 | | |10.54.36.13 |05:22:32| ||norm|2 | | |0|
|HTTP_NORMAL |T776_U21405_M0 | | |10.54.36.11 |05:22:32| ||norm|2 | | |0|
|HTTP_NORMAL |T777_U21406_M0 | | |10.54.36.37 |05:22:33| ||norm|2 | | |0|
|HTTP_NORMAL |T778_U21407_M0 | | |10.54.36.26 |05:22:33| ||norm|2 | | |0|
|HTTP_NORMAL |T779_U21409_M0 | | |10.54.36.28 |05:22:34| ||norm|2 | | |0|
|HTTP_NORMAL |T780_U21410_M0 | | |10.54.36.14 |05:24:40|4 ||norm|1 | | |0|
|HTTP_NORMAL |T781_U21411_M0 | | |10.54.36.41 |05:22:40| ||norm|1 | | |0|
|HTTP_NORMAL |T782_U21415_M0 | | |10.54.36.29 |05:22:48| ||norm|2 | | |0|
|HTTP_NORMAL |T783_U21416_M0 | | |10.54.36.29 |05:22:48| ||norm|8 | | |0|
|HTTP_NORMAL |T784_U21417_M0 | | |10.54.36.29 |05:22:48| ||norm|2 | | |0|
|SYNC_RFC |T785_U21431_M0 | | |smprd02.niladv.org |05:23:16| ||norm|1 | | |0|
|SYNC_RFC |T786_U21432_M0 | | |smprd02.niladv.org |05:23:18| ||norm|1 | | |0|
|SYNC_RFC |T787_U21433_M0 | | |smprd02.niladv.org |05:23:20| ||norm|1 | | |0|
|SYNC_RFC |T788_U21435_M0 | | |smprd02.niladv.org |05:23:22| ||norm|1 | | |0|
|SYNC_RFC |T789_U21437_M0 | | |smprd02.niladv.org |05:23:24| ||norm|1 | | |0|
|HTTP_NORMAL |T790_U21439_M0 | | |10.54.36.35 |05:23:28| ||norm|1 | | |0|
|HTTP_NORMAL |T791_U21444_M0 | | |10.54.36.27 |05:23:50| ||norm|3 | | |0|
|HTTP_NORMAL |T792_U21445_M0 | | |10.50.47.10 |05:23:53| ||norm|1 | | |0|
|SYNC_RFC |T793_U21447_M0 | | |smprd02.niladv.org |05:23:55| ||norm|1 | | |0|
|HTTP_NORMAL |T794_U21448_M0 | | |10.54.36.19 |05:23:58| ||norm|13 | | |0|
|INTERNAL |T795_U21472_M0 | | | |05:24:40|3 ||high| | | |0|
|HTTP_NORMAL |T796_U21451_M0 | | |10.54.36.12 |05:24:03| ||norm|4 | | |0|
|HTTP_NORMAL |T797_U21452_M0 | | |10.54.36.25 |05:24:03| ||norm|1 | | |0|
|SYNC_RFC |T798_U21453_M0 | | |smprd02.niladv.org |05:24:03| ||norm|1 | | |0|
|HTTP_NORMAL |T799_U21455_M0 | | |10.54.36.17 |05:24:05| ||norm|1 | | |0|
|SYNC_RFC |T800_U21456_M0 | | |smprd02.niladv.org |05:24:10| ||norm|1 | | |0|
|SYNC_RFC |T801_U21458_M0 | | |smprd02.niladv.org |05:24:15| ||norm|1 | | |0|
|SYNC_RFC |T802_U21459_M0 | | |smprd02.niladv.org |05:24:16| ||norm|1 | | |0|
|SYNC_RFC |T803_U21460_M0 | | |smprd02.niladv.org |05:24:18| ||norm|1 | | |0|
|SYNC_RFC |T804_U21461_M0 | | |smprd02.niladv.org |05:24:20| ||norm|1 | | |0|
|SYNC_RFC |T805_U21464_M0 | | |smprd02.niladv.org |05:24:22| ||norm|1 | | |0|
|SYNC_RFC |T806_U21466_M0 | | |smprd02.niladv.org |05:24:24| ||norm|1 | | |0|
|HTTP_NORMAL |T807_U21467_M0 | | |10.54.36.11 |05:24:32| ||norm|1 | | |0|
|HTTP_NORMAL |T808_U21468_M0 | | |10.54.36.13 |05:24:33| ||norm|1 | | |0|
|HTTP_NORMAL |T809_U21469_M0 | | |10.54.36.37 |05:24:33| ||norm|1 | | |0|
|HTTP_NORMAL |T810_U21471_M0 | | |10.54.36.14 |05:24:37| ||norm|1 | | |0|

Found 810 logons with 810 sessions


Total ES (gross) memory of all sessions: 62 MB
Most ES (gross) memory allocated by T47_U9774_M0: 8 MB

Force ABAP stack dump of session T12_U20220_M0


Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 05:24:43:900 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T253_U20470_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0
Force ABAP stack dump of session T288_U20578_M0
Force ABAP stack dump of session T313_U20596_M0
Force ABAP stack dump of session T315_U20602_M0
Force ABAP stack dump of session T323_U20621_M0
Force ABAP stack dump of session T342_U20674_M0
Force ABAP stack dump of session T364_U20713_M0
Force ABAP stack dump of session T404_U20779_M0
Force ABAP stack dump of session T426_U20813_M0
Force ABAP stack dump of session T458_U20872_M0
Force ABAP stack dump of session T461_U20878_M0
Force ABAP stack dump of session T497_U20937_M0
Force ABAP stack dump of session T513_U20966_M0
Force ABAP stack dump of session T558_U21031_M0
Force ABAP stack dump of session T560_U21036_M0
Force ABAP stack dump of session T616_U21135_M0
Force ABAP stack dump of session T639_U21178_M0
Force ABAP stack dump of session T643_U21185_M0
Force ABAP stack dump of session T709_U21296_M0
Force ABAP stack dump of session T739_U21349_M0
Force ABAP stack dump of session T755_U21443_M0
Force ABAP stack dump of session T794_U21448_M0

RFC-Connection Table (289 entries) Sun Sep 22 05:24:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 2|54277723|54277723SU21134_M0 |T615_U21134_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 4|53822060|53822060SU20926_M0 |T492_U20926_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 5|53882124|53882124SU20949_M0 |T504_U20949_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 6|54577699|54577699SU21273_M0 |T694_U21273_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 7|54205118|54205118SU21101_M0 |T593_U21101_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 10|54372533|54372533SU21171_M0 |T636_U21171_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 11|53813726|53813726SU20922_M0 |T489_U20922_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 12|53370125|53370125SU20716_M0 |T352_U20716_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 13|53891128|53891128SU20952_M0 |T507_U20952_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 14|54748042|54748042SU21347_M0 |T737_U21347_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 16|53750985|53750985SU20896_M0 |T473_U20896_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 18|54702717|54702717SU21323_M0 |T721_U21323_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 19|54366552|54366552SU21169_M0 |T634_U21169_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 20|53819053|53819053SU20924_M0 |T491_U20924_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 21|53677905|53677905SU20861_M0 |T451_U20861_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 23|53812054|53812054SU20921_M0 |T488_U20921_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 24|54369536|54369536SU21170_M0 |T635_U21170_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 25|53941255|53941255SU20976_M0 |T520_U20976_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 26|54305161|54305161SU20692_M0 |T334_U20692_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 27|54586777|54586777SU21276_M0 |T697_U21276_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 28|53871130|53871130SU20946_M0 |T502_U20946_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 29|54554387|54554387SU21260_M0 |T685_U21260_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 30|54448632|54448632SU21205_M0 |T658_U21205_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 31|53533790|53533790SU20795_M0 |T415_U20795_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 33|54303059|54303059SU20690_M0 |T341_U20690_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 34|53477705|53477705SU20764_M0 |T396_U20764_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 35|53807465|53807465SU20918_M0 |T486_U20918_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 37|53606863|53606863SU20824_M0 |T434_U20824_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 38|53461465|53461465SU20758_M0 |T391_U20758_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 39|54714894|54714894SU21327_M0 |T724_U21327_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 41|54397087|54397087SU21180_M0 |T641_U21180_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 42|53536792|53536792SU20796_M0 |T416_U20796_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 43|53659094|53659094SU20854_M0 |T440_U20854_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 44|54184460|54184460SU20641_M0 |T331_U20641_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 45|53745004|53745004SU20892_M0 |T471_U20892_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 48|54172368|54172368SU21089_M0 |T588_U21089_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 50|53747978|53747978SU20893_M0 |T472_U20893_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 51|54517702|54517702SU21245_M0 |T678_U21245_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 53|53816065|53816065SU20923_M0 |T490_U20923_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 55|54739047|54739047SU21342_M0 |T735_U21342_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 56|53666302|53666302SU20856_M0 |T439_U20856_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 58|54720907|54720907SU21330_M0 |T726_U21330_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 59|54653834|54653834SU21308_M0 |T716_U21308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 60|53799525|53799525SU20916_M0 |T484_U20916_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 61|54583761|54583761SU21275_M0 |T696_U21275_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 62|54199046|54199046SU21096_M0 |T534_U21096_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 63|54717900|54717900SU21328_M0 |T725_U21328_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 64|54375579|54375579SU21172_M0 |T637_U21172_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 65|54778897|54778897SU21364_M0 |T749_U21364_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 66|54580753|54580753SU21274_M0 |T695_U21274_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 67|54240969|54240969SU20669_M0 |T349_U20669_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 68|53474712|53474712SU20763_M0 |T395_U20763_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 69|53609831|53609831SU20825_M0 |T435_U20825_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 70|54213601|54213601SU21103_M0 |T549_U21103_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 71|53603861|53603861SU20823_M0 |T433_U20823_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 72|53504970|53504970SU20777_M0 |T402_U20777_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 75|54293525|54293525SU21142_M0 |T620_U21142_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 76|53740983|53740983SU20890_M0 |T469_U20890_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 77|54775980|54775980SU21363_M0 |T748_U21363_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 79|54378556|54378556SU21173_M0 |T638_U21173_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 80|54405977|54405977SU21183_M0 |T601_U21183_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 81|53539769|53539769SU20797_M0 |T417_U20797_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 82|53542786|53542786SU20798_M0 |T418_U20798_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 83|54031268|54031268SU21015_M0 |T546_U21015_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 84|54055969|54055969SU20600_M0 |T277_U20600_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 85|53753961|53753961SU20897_M0 |T474_U20897_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 86|54508676|54508676SU21242_M0 |T665_U21242_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 87|54290760|54290760SU21141_M0 |T619_U21141_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 89|54274191|54274191SU21133_M0 |T614_U21133_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 90|53335572|53335572SU20704_M0 |T359_U20704_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 91|54574766|54574766SU21272_M0 |T693_U21272_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 92|53952146|53952146SU20984_M0 |T527_U20984_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 93|53933395|53933395SU20967_M0 |T514_U20967_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 95|53964177|53964177SU20988_M0 |T531_U20988_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 96|53683884|53683884SU20864_M0 |T453_U20864_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 97|54679972|54679972SU21315_M0 |T717_U21315_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 98|32252616|32252616SU20586_M0 |T308_U20586_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 102|54568649|54568649SU21270_M0 |T692_U21270_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 103|53955198|53955198SU20985_M0 |T528_U20985_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 104|53961202|53961202SU20987_M0 |T530_U20987_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 105|54781967|54781967SU21365_M0 |T750_U21365_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 106|53713330|53713330SU20876_M0 |T384_U20876_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 107|54311482|54311482SU21148_M0 |T625_U21148_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 108|54009269|54009269SU21007_M0 |T539_U21007_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 109|53791976|53791976SU20908_M0 |T479_U20908_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 112|54241442|54241442SU21118_M0 |T606_U21118_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 113|53825033|53825033SU20927_M0 |T493_U20927_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 115|53872183|53872183SU20947_M0 |T503_U20947_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 116|54544123|54544123SU21254_M0 |T411_U21254_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 118|53918034|53918034SU20961_M0 |T438_U20961_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 119|54302460|54302460SU21145_M0 |T622_U21145_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 120|54711822|54711822SU21326_M0 |T723_U21326_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 121|53450513|53450513SU20752_M0 |T388_U20752_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 122|54618765|54618765SU21293_M0 |T706_U21293_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 123|54511679|54511679SU21243_M0 |T676_U21243_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 124|54169405|54169405SU21088_M0 |T587_U21088_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 125|54308505|54308505SU21147_M0 |T624_U21147_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 126|54246530|54246530SU20673_M0 |T339_U20673_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 129|54268124|54268124SU21129_M0 |T557_U21129_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 130|53730608|53730608SU20888_M0 |T468_U20888_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 131|53863051|53863051SU20941_M0 |T500_U20941_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 132|54505692|54505692SU21241_M0 |T664_U21241_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 133|54022221|54022221SU21012_M0 |T543_U21012_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 134|54358834|54358834SU21166_M0 |T632_U21166_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 135|53772320|53772320SU20903_M0 |T478_U20903_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 136|54013979|54013979SU21009_M0 |T541_U21009_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 137|54345802|54345802SU21160_M0 |T629_U21160_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 139|54708905|54708905SU21325_M0 |T722_U21325_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 140|54235438|54235438SU21115_M0 |T604_U21115_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 141|54644747|54644747SU21304_M0 |T713_U21304_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 142|53888093|53888093SU20951_M0 |T506_U20951_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 143|53958159|53958159SU20986_M0 |T529_U20986_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 145|53615810|53615810SU20828_M0 |T437_U20828_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 146|54238481|54238481SU21117_M0 |T605_U21117_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 147|54436627|54436627SU21201_M0 |T654_U21201_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 149|53338570|53338570SU20705_M0 |T360_U20705_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 152|54221690|54221690SU21110_M0 |T600_U21110_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 153|53674932|53674932SU20860_M0 |T450_U20860_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 154|54028234|54028234SU21014_M0 |T545_U21014_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 157|54093287|54093287SU21054_M0 |T565_U21054_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 158|53573044|53573044SU20809_M0 |T293_U20809_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 159|54514686|54514686SU21244_M0 |T677_U21244_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 160|53405636|53405636SU20734_M0 |T378_U20734_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 161|54650835|54650835SU21306_M0 |T715_U21306_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 162|54160337|54160337SU21084_M0 |T584_U21084_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 163|54083666|54083666SU21050_M0 |T562_U21050_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 164|54229412|54229412SU21113_M0 |T602_U21113_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 165|53591025|53591025SU20818_M0 |T430_U20818_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 166|53940202|53940202SU20975_M0 |T519_U20975_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 167|54430592|54430592SU21199_M0 |T653_U21199_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 169|53671936|53671936SU20859_M0 |T449_U20859_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 171|53575295|53575295SU20810_M0 |T424_U20810_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 172|54989183|54989183SU21461_M0 |T804_U21461_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 173|54500465|54500465SU21238_M0 |T666_U21238_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 174|54015616|54015616SU21010_M0 |T542_U21010_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 175|53402644|53402644SU20733_M0 |T377_U20733_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 176|54333571|54333571SU20703_M0 |T358_U20703_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 177|54330571|54330571SU20702_M0 |T357_U20702_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 178|53359275|53359275SU20714_M0 |T365_U20714_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 179|54647828|54647828SU21305_M0 |T714_U21305_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 180|53426328|53426328SU20745_M0 |T385_U20745_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 181|54427906|54427906SU21198_M0 |T652_U21198_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 182|54081559|54081559SU21049_M0 |T550_U21049_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 183|54166366|54166366SU21086_M0 |T586_U21086_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 184|54025224|54025224SU21013_M0 |T544_U21013_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 185|54150626|54150626SU21080_M0 |T581_U21080_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 186|54128970|54128970SU21069_M0 |T525_U21069_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 187|53341583|53341583SU20707_M0 |T361_U20707_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 188|54828522|54828522SU21379_M0 |T756_U21379_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 189|54967717|54967717SU21453_M0 |T798_U21453_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 190|53894111|53894111SU20953_M0 |T508_U20953_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 191|54485316|54485316SU21231_M0 |T669_U21231_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 192|54232387|54232387SU21114_M0 |T603_U21114_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 194|53944437|53944437SU20977_M0 |T521_U20977_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 195|53702262|53702262SU20874_M0 |T460_U20874_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 196|54090292|54090292SU21053_M0 |T564_U21053_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 197|54550695|54550695SU21259_M0 |T684_U21259_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 199|53643123|53643123SU20844_M0 |T428_U20844_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 200|53612837|53612837SU20827_M0 |T436_U20827_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 201|53399648|53399648SU20732_M0 |T376_U20732_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 202|54305468|54305468SU21146_M0 |T623_U21146_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 204|53438225|53438225SU20748_M0 |T256_U20748_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 206|54442606|54442606SU21203_M0 |T656_U21203_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 207|54096290|54096290SU21055_M0 |T566_U21055_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 208|54034234|54034234SU21016_M0 |T547_U21016_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 209|53580137|53580137SU20812_M0 |T425_U20812_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 212|53519586|53519586SU20784_M0 |T408_U20784_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 213|53545747|53545747SU20799_M0 |T419_U20799_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 214|54476047|54476047SU21228_M0 |T674_U21228_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 216|54141465|54141465SU20630_M0 |T328_U20630_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 218|54925090|54925090SU21437_M0 |T789_U21437_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 220|53885142|53885142SU20950_M0 |T505_U20950_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 221|54118046|54118046SU20620_M0 |T322_U20620_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 222|54890268|54890268SU21419_M0 |T772_U21419_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 223|54641839|54641839SU21303_M0 |T712_U21303_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 224|54919111|54919111SU21433_M0 |T787_U21433_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 225|53848955|53848955SU20936_M0 |T445_U20936_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 226|54622455|54622455SU21294_M0 |T707_U21294_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 227|54012478|54012478SU21008_M0 |T540_U21008_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 230|54815117|54815117SU21374_M0 |T563_U21374_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 231|54439604|54439604SU21202_M0 |T655_U21202_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 232|54299485|54299485SU21144_M0 |T621_U21144_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 233|54858049|54858049SU21394_M0 |T766_U21394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 234|54107318|54107318SU20615_M0 |T320_U20615_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 235|54099331|54099331SU21056_M0 |T567_U21056_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 236|53598228|53598228SU20821_M0 |T432_U20821_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 237|54950971|54950971SU21442_M0 |T572_U21442_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 238|53988107|53988107SU20997_M0 |T522_U20997_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 239|53468725|53468725SU20761_M0 |T393_U20761_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 240|54983185|54983185SU21459_M0 |T802_U21459_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 242|53742654|53742654SU20891_M0 |T470_U20891_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 243|54913113|54913113SU21431_M0 |T785_U21431_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 244|54142537|54142537SU21075_M0 |T578_U21075_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 245|54855048|54855048SU21392_M0 |T765_U21392_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 247|54958316|54958316SU21447_M0 |T793_U21447_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 248|54414846|54414846SU21187_M0 |T644_U21187_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 250|53800681|53800681SU20917_M0 |T485_U20917_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 251|54155465|54155465SU21082_M0 |T583_U21082_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 253|53408645|53408645SU20736_M0 |T379_U20736_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 254|53680911|53680911SU20863_M0 |T452_U20863_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 256|54445653|54445653SU21204_M0 |T657_U21204_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 257|54163344|54163344SU21085_M0 |T585_U21085_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 258|54848964|54848964SU21389_M0 |T763_U21389_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 259|53465735|53465735SU20760_M0 |T392_U20760_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 260|54916038|54916038SU21432_M0 |T786_U21432_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 261|54467123|54467123SU21224_M0 |T673_U21224_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 263|54045247|54045247SU20597_M0 |T314_U20597_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 264|54787981|54787981SU21368_M0 |T752_U21368_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 265|53471695|53471695SU20762_M0 |T394_U20762_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 266|54784975|54784975SU21366_M0 |T751_U21366_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 267|54153466|54153466SU21081_M0 |T582_U21081_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 269|54806114|54806114SU21372_M0 |T753_U21372_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 271|53396676|53396676SU20731_M0 |T375_U20731_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 273|54986113|54986113SU21460_M0 |T803_U21460_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 274|54102300|54102300SU21057_M0 |T568_U21057_M0_I|ALLOCATED |SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 276|54846049|54846049SU21388_M0 |T762_U21388_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 277|54062174|54062174SU21035_M0 |T559_U21035_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 278|54838787|54838787SU21385_M0 |T760_U21385_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 279|54981013|54981013SU21458_M0 |T801_U21458_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 280|54922085|54922085SU21435_M0 |T788_U21435_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 281|54975858|54975858SU21456_M0 |T800_U21456_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 282|53511057|53511057SU20782_M0 |T406_U20782_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 283|54840465|54840465SU21386_M0 |T761_U21386_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 284|53729452|53729452SU20887_M0 |T467_U20887_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 286|54897592|54897592SU21424_M0 |T768_U21424_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 288|54852040|54852040SU21390_M0 |T764_U21390_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 289|54992105|54992105SU21464_M0 |T805_U21464_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 290|54995162|54995162SU21466_M0 |T806_U21466_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |

Found 289 RFC-Connections

CA Blocks
------------------------------------------------------------
0 WORKER 8468
1 WORKER 32121
333 INVALID -1
334 INVALID -1
335 INVALID -1
336 INVALID -1
337 INVALID -1
338 INVALID -1
339 INVALID -1
340 INVALID -1
341 INVALID -1
342 INVALID -1
343 INVALID -1
344 INVALID -1
345 INVALID -1
346 INVALID -1
347 INVALID -1
348 INVALID -1
349 INVALID -1
350 INVALID -1
351 INVALID -1
352 INVALID -1
353 INVALID -1
354 INVALID -1
355 INVALID -1
356 INVALID -1
357 INVALID -1
358 INVALID -1
359 INVALID -1
360 INVALID -1
361 INVALID -1
362 INVALID -1
363 INVALID -1
364 INVALID -1
365 INVALID -1
366 INVALID -1
367 INVALID -1
368 INVALID -1
369 INVALID -1
370 INVALID -1
371 INVALID -1
372 INVALID -1
373 INVALID -1
374 INVALID -1
375 INVALID -1
376 INVALID -1
377 INVALID -1
378 INVALID -1
379 INVALID -1
380 INVALID -1
381 INVALID -1
382 INVALID -1
383 INVALID -1
384 INVALID -1
385 INVALID -1
386 INVALID -1
387 INVALID -1
388 INVALID -1
389 INVALID -1
390 INVALID -1
391 INVALID -1
392 INVALID -1
393 INVALID -1
394 INVALID -1
395 INVALID -1
396 INVALID -1
397 INVALID -1
398 INVALID -1
399 INVALID -1
400 INVALID -1
401 INVALID -1
402 INVALID -1
403 INVALID -1
404 INVALID -1
405 INVALID -1
406 INVALID -1
407 INVALID -1
408 INVALID -1
409 INVALID -1
410 INVALID -1
411 INVALID -1
412 INVALID -1
413 INVALID -1
414 INVALID -1
415 INVALID -1
416 INVALID -1
417 INVALID -1
418 INVALID -1
419 INVALID -1
420 INVALID -1
421 INVALID -1
422 INVALID -1
423 INVALID -1
424 INVALID -1
425 INVALID -1
426 INVALID -1
427 INVALID -1
428 INVALID -1
429 INVALID -1
430 INVALID -1
... skip next entries
100 ca_blk slots of 6000 in use, 98 currently unowned (in request queues)
MPI Info Sun Sep 22 05:24:43 2019
------------------------------------------------------------
Current pipes in use: 217
Current / maximal blocks in use: 258 / 1884

Periodic Tasks Sun Sep 22 05:24:43 2019


------------------------------------------------------------

|Handle |Type |Calls |Wait(sec) |Session |Resp-ID |
|--------|--------------------|----------|----------|--------------------|----------|
| 0|BUFREF | 5721| 77| | |
| 1|DDLOG | 5721| 77| | |
| 2|BTCSCHED | 11439| 21| | |
| 3|RESTART_ALL | 2288| 13| | |
| 4|ENVCHECK | 34333| 20| | |
| 5|AUTOABAP | 2288| 13| | |
| 6|BGRFC_WATCHDOG | 2289| 13| | |
| 7|AUTOTH | 323| 21| | |
| 8|AUTOCCMS | 11439| 21| | |
| 9|AUTOSECURITY | 11438| 21| | |
| 10|LOAD_CALCULATION | 685712| 1| | |
| 11|SPOOLALRM | 11444| 21| | |
| 12|CALL_DELAYED | 0| 3894| | |

Found 13 periodic tasks

********** SERVER SNAPSHOT 188 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:24:43 2019) - end **********

***LOG Q41=> DpDumpInternalTables, () [dpxxdisp.c 3624]


*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:24:48:873 2019


DpHdlSoftCancel: cancel request for T781_U21411_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:24:53:878 2019


DpHdlSoftCancel: cancel request for T783_U21416_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:24:59:884 2019


DpHdlSoftCancel: cancel request for T771_U21421_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T770_U21420_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:00:415 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:25:00:913 2019


DpHdlSoftCancel: cancel request for T773_U21423_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:03:868 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W0-2299
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W1-2300
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W5-2301
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W6-2302
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpWpDynCreate: created new work process W7-2303

Sun Sep 22 05:25:04:165 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:25:05:459 2019


*** ERROR => DpHdlDeadWp: W0 (pid 2299) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2299) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 2299)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 2300) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2300) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 2300)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 2301) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2301) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 2301)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 2302) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2302) exited with exit code 255
DpWpRecoverMutex: recover resources of W6 (pid = 2302)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 2303) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=2303) exited with exit code 255
DpWpRecoverMutex: recover resources of W7 (pid = 2303)
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:25:05:918 2019


DpHdlSoftCancel: cancel request for T769_U21426_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T774_U21428_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:18:971 2019


DpHdlSoftCancel: cancel request for T767_U21429_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:23:869 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot

Sun Sep 22 05:25:24:187 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:25:29:933 2019


DpHdlSoftCancel: cancel request for T790_U21439_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:39:941 2019


DpHdlSoftCancel: cancel request for T733_U21340_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:43:870 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: delayed snapshot exist, skip new snapshot
DpTriggerSapSnapshot: start /usr/sap/SMP/DVEBMGS00/exe/sapcontrol
DpTriggerSapSnapshot: sapcontrol runs with pid 2600

Sun Sep 22 05:25:54:991 2019


DpHdlSoftCancel: cancel request for T755_U21443_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T791_U21444_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T792_U21445_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:25:59:992 2019


DpHdlSoftCancel: cancel request for T794_U21448_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:03:870 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:26:04:167 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
DpHdlSoftCancel: cancel request for T796_U21451_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T797_U21452_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:09:223 2019


DpHdlSoftCancel: cancel request for T799_U21455_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:22:605 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:26:23:871 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpCheckSapcontrolProcess: sapcontrol with pid 2600 terminated

Sun Sep 22 05:26:34:241 2019


DpHdlSoftCancel: cancel request for T807_U21467_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T808_U21468_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T809_U21469_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:39:243 2019


DpHdlSoftCancel: cancel request for T810_U21471_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:43:871 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-3215
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:26:44:245 2019


DpHdlSoftCancel: cancel request for T609_U21473_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:44:818 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 3215) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=3215) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 3215)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:26:54:254 2019


DpHdlSoftCancel: cancel request for T812_U21476_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:26:59:255 2019


DpHdlSoftCancel: cancel request for T813_U21478_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:27:03:871 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:27:04:166 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
DpHdlSoftCancel: cancel request for T780_U21479_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T814_U21482_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:27:04:283 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:27:14:257 2019


DpHdlSoftCancel: cancel request for T816_U21485_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:27:23:871 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:27:34:274 2019


DpHdlSoftCancel: cancel request for T822_U21494_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T823_U21495_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:27:43:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:27:54:877 2019


DpHdlSoftCancel: cancel request for T826_U21502_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:27:59:878 2019


DpHdlSoftCancel: cancel request for T827_U21504_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T828_U21505_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:28:03:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:28:04:167 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
Sun Sep 22 05:28:04:883 2019
DpHdlSoftCancel: cancel request for T829_U21506_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T830_U21507_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:28:09:889 2019


DpHdlSoftCancel: cancel request for T856_U21556_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:28:20:777 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:28:23:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:28:29:900 2019


DpHdlSoftCancel: cancel request for T767_U21521_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:28:39:910 2019


DpHdlSoftCancel: cancel request for T855_U21555_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:28:40:795 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:28:43:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
Sun Sep 22 05:28:48:922 2019
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UPD now with high load, load / queue fill level = 0.905781 / 0.000000
DpSendLoadInfo: quota for load / queue fill level = 0.900000 / 5.000000
DpSendLoadInfo: queue UP2 now with high load, load / queue fill level = 0.903081 / 0.000000

Sun Sep 22 05:28:52:925 2019


DpSendLoadInfo: queue UPD no longer with high load
DpSendLoadInfo: queue UP2 no longer with high load

Sun Sep 22 05:28:54:993 2019


DpHdlSoftCancel: cancel request for T839_U21527_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T840_U21528_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:03:872 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:29:04:167 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:29:05:814 2019


DpHdlSoftCancel: cancel request for T843_U21535_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T841_U21530_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T842_U21531_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:15:821 2019


DpHdlSoftCancel: cancel request for T845_U21538_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:20:826 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:29:23:873 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:29:34:839 2019


DpHdlSoftCancel: cancel request for T836_U21546_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T851_U21547_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T852_U21548_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:39:844 2019


DpHdlSoftCancel: cancel request for T853_U21550_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:40:843 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:29:43:874 2019


*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:29:44:849 2019


DpHdlSoftCancel: cancel request for T854_U21551_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:49:852 2019


DpHdlSoftCancel: cancel request for T667_U21553_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:29:59:858 2019


DpHdlSoftCancel: cancel request for T858_U21559_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:30:00:863 2019


***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]
Sun Sep 22 05:30:03:874 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpWpDynCreate: created new work process W0-4218
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpWpDynCreate: created new work process W1-4221
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpWpDynCreate: created new work process W5-4222
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpWpDynCreate: created new work process W6-4223
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
DpWpDynCreate: created new work process W7-4224

Sun Sep 22 05:30:04:168 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:30:04:864 2019
DpHdlSoftCancel: cancel request for T861_U21562_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
DpHdlSoftCancel: cancel request for T860_U21561_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)
*** ERROR => DpHdlDeadWp: W0 (pid 4218) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4218) exited with exit code 255
DpWpRecoverMutex: recover resources of W0 (pid = 4218)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W1 (pid 4221) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4221) exited with exit code 255
DpWpRecoverMutex: recover resources of W1 (pid = 4221)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W5 (pid 4222) died (severity=0, status=65280) [dpxxwp.c 1463]
DpTraceWpStatus: child (pid=4222) exited with exit code 255
DpWpRecoverMutex: recover resources of W5 (pid = 4222)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W6 (pid 4223) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W6 (pid = 4223)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** ERROR => DpHdlDeadWp: W7 (pid 4224) died (severity=0, status=0) [dpxxwp.c 1463]
DpWpRecoverMutex: recover resources of W7 (pid = 4224)
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:30:20:881 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:30:23:875 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W1 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W5 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W6 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot
*** WARNING => DpHdlDeadWp: wp_adm slot for W7 has no pid [dpxxwp.c 1373]
DpSkipSnapshot: last snapshot created at Sun Sep 22 05:25:43 2019, skip new snapshot

Sun Sep 22 05:30:34:302 2019
DpHdlSoftCancel: cancel request for T876_U21583_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:30:39:307 2019
DpHdlSoftCancel: cancel request for T877_U21585_M0 received from ICMAN (reason=DP_SOFTCANCEL_ICM_CONNECTION_CLOSED)

Sun Sep 22 05:30:40:898 2019
***LOG Q07=> DpCheckSession, bad tm ( -1) [dpxxdisp.c 1833]

Sun Sep 22 05:30:43:875 2019
*** WARNING => DpHdlDeadWp: wp_adm slot for W0 has no pid [dpxxwp.c 1373]

********** SERVER SNAPSHOT 189 (Reason: Workprocess 0 died / Time: Sun Sep 22 05:30:43 2019) - begin **********

Server smprd02_SMP_00, Sun Sep 22 05:30:43 2019

Scheduler info
--------------
WP info
DpNumberOfDiaWps: dia_wps 3, standby_wps 0
#dia = 3
#btc = 5
#standby = 0
#max = 21
General Scheduler info
preemptionInfo.isActive = true
preemptionInfo.timeslice = 500
preemptionInfo.checkLoad = true
Prio Class High
maxRuntime[RQ_Q_PRIO_HIGH] = 600 sec
maxRuntimeHalf[RQ_Q_PRIO_HIGH] = 300 sec
Prio Class Normal
maxRuntime[RQ_Q_PRIO_NORMAL] = 3600 sec
maxRuntimeHalf[RQ_Q_PRIO_NORMAL] = 1800 sec
maxTicketsForPrio[RQ_Q_PRIO_NORMAL] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_NORMAL] = 0
withPrioTickets[RQ_Q_PRIO_NORMAL] = true
Prio Class Low
maxRuntime[RQ_Q_PRIO_LOW] = infinite
maxRuntimeHalf[RQ_Q_PRIO_LOW] = infinite
maxTicketsForPrio[RQ_Q_PRIO_LOW] = 2
maxTicketsForPrioIncrement[RQ_Q_PRIO_LOW] = 0
withPrioTickets[RQ_Q_PRIO_LOW] = true
Actual tickets in use
actTicketsInUseForPrio[RQ_Q_PRIO_NORMAL] = 1
actTicketsInUseForPrio[RQ_Q_PRIO_LOW] = 1
Running requests[RQ_Q_PRIO_NORMAL] = 0
Running requests[RQ_Q_PRIO_LOW] = 1

Queue Statistics Sun Sep 22 05:30:43 2019
------------------------------------------------------------
Number of lost wakeup datagrams: 4
Max. number of queue elements : 14000

DIA : 2709 (peak 2710, writeCount 24106746, readCount 24104037)
UPD : 0 (peak 31, writeCount 4963, readCount 4963)
ENQ : 0 (peak 0, writeCount 0, readCount 0)
BTC : 0 (peak 65, writeCount 2125423, readCount 2125423)
SPO : 0 (peak 2, writeCount 25136, readCount 25136)
UP2 : 0 (peak 1, writeCount 2349, readCount 2349)
DISP: 0 (peak 67, writeCount 890475, readCount 890475)
GW : 0 (peak 49, writeCount 22411025, readCount 22411025)
ICM : 1 (peak 186, writeCount 391143, readCount 391142)
LWP : 1 (peak 16, writeCount 38318, readCount 38317)

Session queue dump (high priority, 0 elements, peak 39):

Session queue dump (normal priority, 887 elements, peak 887):
-1 <- 908 < T877_U21585_M0> -> 907
908 <- 907 < T876_U21583_M0> -> 750
907 <- 750 < T719_U21318_M0> -> 749
750 <- 749 < T718_U21317_M0> -> 751
749 <- 751 < T720_U21322_M0> -> 891
751 <- 891 < T860_U21561_M0> -> 892
891 <- 892 < T861_U21562_M0> -> 889
892 <- 889 < T858_U21559_M0> -> 697
889 <- 697 < T667_U21553_M0> -> 884
697 <- 884 < T853_U21550_M0> -> 671
884 <- 671 < T640_U21310_M0> -> 883
671 <- 883 < T852_U21548_M0> -> 882
883 <- 882 < T851_U21547_M0> -> 867
882 <- 867 < T836_U21546_M0> -> 742
867 <- 742 < T711_U21301_M0> -> 876
742 <- 876 < T845_U21538_M0> -> 741
876 <- 741 < T710_U21297_M0> -> 873
741 <- 873 < T842_U21531_M0> -> 872
873 <- 872 < T841_U21530_M0> -> 740
872 <- 740 < T709_U21296_M0> -> 739
740 <- 739 < T708_U21295_M0> -> 874
739 <- 874 < T843_U21535_M0> -> 736
874 <- 736 < T705_U21292_M0> -> 734
736 <- 734 < T703_U21289_M0> -> 735
734 <- 735 < T704_U21290_M0> -> 871
735 <- 871 < T840_U21528_M0> -> 870
871 <- 870 < T839_U21527_M0> -> 733
870 <- 733 < T702_U21285_M0> -> 732
733 <- 732 < T701_U21283_M0> -> 731
732 <- 731 < T700_U21282_M0> -> 730
731 <- 730 < T699_U21281_M0> -> 729
730 <- 729 < T698_U21280_M0> -> 798
729 <- 798 < T767_U21521_M0> -> 721
798 <- 721 < T690_U21266_M0> -> 722
721 <- 722 < T691_U21269_M0> -> 718
722 <- 718 < T687_U21262_M0> -> 720
718 <- 720 < T689_U21264_M0> -> 719
720 <- 719 < T688_U21263_M0> -> 887
719 <- 887 < T856_U21556_M0> -> 717
887 <- 717 < T686_U21261_M0> -> 713
717 <- 713 < T682_U21256_M0> -> 714
713 <- 714 < T683_U21257_M0> -> 861
714 <- 861 < T830_U21507_M0> -> 712
861 <- 712 < T681_U21255_M0> -> 860
712 <- 860 < T829_U21506_M0> -> 859
860 <- 859 < T828_U21505_M0> -> 858
859 <- 858 < T827_U21504_M0> -> 857
858 <- 857 < T826_U21502_M0> -> 711
857 <- 711 < T680_U21251_M0> -> 854
711 <- 854 < T823_U21495_M0> -> 853
854 <- 853 < T822_U21494_M0> -> 847
853 <- 847 < T816_U21485_M0> -> 700
847 <- 700 < T668_U21233_M0> -> 702
700 <- 702 < T670_U21234_M0> -> 698
702 <- 698 < T671_U21232_M0> -> 706
698 <- 706 < T675_U21230_M0> -> 845
706 <- 845 < T814_U21482_M0> -> 811
845 <- 811 < T780_U21479_M0> -> 844
811 <- 844 < T813_U21478_M0> -> 843
844 <- 843 < T812_U21476_M0> -> 703
843 <- 703 < T672_U21223_M0> -> 640
703 <- 640 < T609_U21473_M0> -> 692
640 <- 692 < T661_U21211_M0> -> 694
692 <- 694 < T663_U21213_M0> -> 693
694 <- 693 < T662_U21212_M0> -> 691
693 <- 691 < T660_U21210_M0> -> 690
691 <- 690 < T659_U21209_M0> -> 841
690 <- 841 < T810_U21471_M0> -> 840
841 <- 840 < T809_U21469_M0> -> 839
840 <- 839 < T808_U21468_M0> -> 838
839 <- 838 < T807_U21467_M0> -> 682
838 <- 682 < T651_U21197_M0> -> 681
682 <- 681 < T650_U21193_M0> -> 680
681 <- 680 < T649_U21192_M0> -> 679
680 <- 679 < T648_U21191_M0> -> 678
679 <- 678 < T647_U21190_M0> -> 677
678 <- 677 < T646_U21189_M0> -> 830
677 <- 830 < T799_U21455_M0> -> 676
830 <- 676 < T645_U21188_M0> -> 828
676 <- 828 < T797_U21452_M0> -> 673
828 <- 673 < T642_U21184_M0> -> 827
673 <- 827 < T796_U21451_M0> -> 674
827 <- 674 < T643_U21185_M0> -> 825
674 <- 825 < T794_U21448_M0> -> 823
825 <- 823 < T792_U21445_M0> -> 822
823 <- 822 < T791_U21444_M0> -> 786
822 <- 786 < T755_U21443_M0> -> 670
786 <- 670 < T639_U21178_M0> -> 764
670 <- 764 < T733_U21340_M0> -> 821
764 <- 821 < T790_U21439_M0> -> 664
821 <- 664 < T633_U21167_M0> -> 662
664 <- 662 < T631_U21162_M0> -> 661
662 <- 661 < T630_U21161_M0> -> 801
661 <- 801 < T774_U21428_M0> -> 800
801 <- 800 < T769_U21426_M0> -> 659
800 <- 659 < T628_U21159_M0> -> 658
659 <- 658 < T627_U21157_M0> -> 799
658 <- 799 < T770_U21420_M0> -> 804
799 <- 804 < T771_U21421_M0> -> 814
804 <- 814 < T783_U21416_M0> -> 812
814 <- 812 < T781_U21411_M0> -> 770
812 <- 770 < T739_U21349_M0> -> 647
770 <- 647 < T616_U21135_M0> -> 591
647 <- 591 < T560_U21036_M0> -> 589
591 <- 589 < T558_U21031_M0> -> 544
589 <- 544 < T513_U20966_M0> -> 528
544 <- 528 < T497_U20937_M0> -> 492
528 <- 492 < T461_U20878_M0> -> 489
492 <- 489 < T458_U20872_M0> -> 457
489 <- 457 < T426_U20813_M0> -> 434
457 <- 434 < T404_U20779_M0> -> 395
434 <- 395 < T364_U20713_M0> -> 371
395 <- 371 < T342_U20674_M0> -> 354
371 <- 354 < T323_U20621_M0> -> 346
354 <- 346 < T315_U20602_M0> -> 344
346 <- 344 < T313_U20596_M0> -> 319
344 <- 319 < T288_U20578_M0> -> 291
319 <- 291 < T260_U20481_M0> -> 286
291 <- 286 < T255_U20475_M0> -> 284
286 <- 284 < T253_U20470_M0> -> 282
284 <- 282 < T251_U20559_M0> -> 231
282 <- 231 < T200_U20354_M0> -> 228
231 <- 228 < T197_U20347_M0> -> 221
228 <- 221 < T190_U20350_M0> -> 62
221 <- 62 < T121_U20226_M0> -> 78
62 <- 78 < T88_U20160_M0> -> 69
78 <- 69 < T61_U20113_M0> -> 100
69 <- 100 < T38_U20216_M0> -> 54
100 <- 54 < T23_U20175_M0> -> 89
54 <- 89 < T12_U20220_M0> -> 657
89 <- 657 < T626_U21152_M0> -> 809
657 <- 809 < T778_U21407_M0> -> 806
809 <- 806 < T775_U21404_M0> -> 808
806 <- 808 < T777_U21406_M0> -> 807
808 <- 807 < T776_U21405_M0> -> 810
807 <- 810 < T779_U21409_M0> -> 649
810 <- 649 < T618_U21138_M0> -> 648
649 <- 648 < T617_U21136_M0> -> 788
648 <- 788 < T757_U21380_M0> -> 789
788 <- 789 < T758_U21382_M0> -> 790
789 <- 790 < T759_U21384_M0> -> 644
790 <- 644 < T613_U21131_M0> -> 643
644 <- 643 < T612_U21130_M0> -> 642
643 <- 642 < T611_U21375_M0> -> 785
642 <- 785 < T754_U21377_M0> -> 641
785 <- 641 < T610_U21125_M0> -> 813
641 <- 813 < T782_U21415_M0> -> 639
813 <- 639 < T608_U21122_M0> -> 638
639 <- 638 < T607_U21121_M0> -> 628
638 <- 628 < T597_U21120_M0> -> 557
628 <- 557 < T526_U21370_M0> -> 630
557 <- 630 < T599_U21109_M0> -> 629
630 <- 629 < T598_U21108_M0> -> 778
629 <- 778 < T747_U21361_M0> -> 626
778 <- 626 < T595_U21104_M0> -> 627
626 <- 627 < T596_U21105_M0> -> 625
627 <- 625 < T594_U21102_M0> -> 777
625 <- 777 < T746_U21358_M0> -> 775
777 <- 775 < T744_U21355_M0> -> 622
775 <- 622 < T591_U21098_M0> -> 608
622 <- 608 < T577_U21097_M0> -> 623
608 <- 623 < T592_U21099_M0> -> 774
623 <- 774 < T743_U21354_M0> -> 776
774 <- 776 < T745_U21356_M0> -> 815
776 <- 815 < T784_U21417_M0> -> 771
815 <- 771 < T740_U21351_M0> -> 773
771 <- 773 < T742_U21353_M0> -> 772
773 <- 772 < T741_U21352_M0> -> 769
772 <- 769 < T738_U21348_M0> -> 767
769 <- 767 < T736_U21346_M0> -> 621
767 <- 621 < T590_U21094_M0> -> 765
621 <- 765 < T734_U21341_M0> -> 763
765 <- 763 < T732_U21339_M0> -> 620
763 <- 620 < T589_U21092_M0> -> 761
620 <- 761 < T730_U21336_M0> -> 762
761 <- 762 < T731_U21337_M0> -> 760
762 <- 760 < T729_U21335_M0> -> 759
760 <- 759 < T728_U21334_M0> -> 758
759 <- 758 < T727_U21333_M0> -> 611
758 <- 611 < T580_U21079_M0> -> 610
611 <- 610 < T579_U21076_M0> -> 607
610 <- 607 < T576_U21073_M0> -> 606
607 <- 606 < T575_U21072_M0> -> 555
606 <- 555 < T524_U21071_M0> -> 605
555 <- 605 < T574_U21067_M0> -> 604
605 <- 604 < T573_U21066_M0> -> 602
604 <- 602 < T571_U21063_M0> -> 601
602 <- 601 < T570_U21062_M0> -> 600
601 <- 600 < T569_U21061_M0> -> 585
600 <- 585 < T556_U21043_M0> -> 586
585 <- 586 < T555_U21042_M0> -> 584
586 <- 584 < T553_U21041_M0> -> 583
584 <- 583 < T552_U21048_M0> -> 582
583 <- 582 < T551_U21047_M0> -> 587
582 <- 587 < T554_U21040_M0> -> 592
587 <- 592 < T561_U21037_M0> -> 579
592 <- 579 < T548_U21020_M0> -> 568
579 <- 568 < T537_U21002_M0> -> 569
568 <- 569 < T538_U21004_M0> -> 567
569 <- 567 < T536_U21001_M0> -> 566
567 <- 566 < T535_U21000_M0> -> 553
566 <- 553 < T523_U20998_M0> -> 564
553 <- 564 < T533_U20994_M0> -> 563
564 <- 563 < T532_U20992_M0> -> 239
563 <- 239 < T208_U20991_M0> -> 377
239 <- 377 < T346_U20990_M0> -> 548
377 <- 548 < T517_U20971_M0> -> 546
548 <- 546 < T515_U20968_M0> -> 547
546 <- 547 < T516_U20969_M0> -> 549
547 <- 549 < T518_U20974_M0> -> 543
549 <- 543 < T512_U20965_M0> -> 542
543 <- 542 < T511_U20963_M0> -> 541
542 <- 541 < T510_U20962_M0> -> 540
541 <- 540 < T509_U20958_M0> -> 532
540 <- 532 < T501_U20942_M0> -> 530
532 <- 530 < T499_U20939_M0> -> 529
530 <- 529 < T498_U20938_M0> -> 527
529 <- 527 < T496_U20933_M0> -> 526
527 <- 526 < T495_U20931_M0> -> 514
526 <- 514 < T483_U20929_M0> -> 525
514 <- 525 < T494_U20930_M0> -> 518
525 <- 518 < T487_U20919_M0> -> 478
518 <- 478 < T447_U20910_M0> -> 513
478 <- 513 < T482_U20914_M0> -> 512
513 <- 512 < T481_U20912_M0> -> 495
512 <- 495 < T464_U20911_M0> -> 331
495 <- 331 < T300_U20907_M0> -> 511
331 <- 511 < T480_U20909_M0> -> 508
511 <- 508 < T477_U20902_M0> -> 507
508 <- 507 < T476_U20901_M0> -> 506
507 <- 506 < T475_U20899_M0> -> 497
506 <- 497 < T466_U20886_M0> -> 496
497 <- 496 < T465_U20884_M0> -> 494
496 <- 494 < T463_U20881_M0> -> 493
494 <- 493 < T462_U20880_M0> -> 429
493 <- 429 < T398_U20877_M0> -> 490
429 <- 490 < T459_U20873_M0> -> 488
490 <- 488 < T457_U20870_M0> -> 487
488 <- 487 < T456_U20869_M0> -> 486
487 <- 486 < T455_U20868_M0> -> 485
486 <- 485 < T454_U20867_M0> -> 471
485 <- 471 < T444_U20853_M0> -> 474
471 <- 474 < T443_U20850_M0> -> 473
474 <- 473 < T442_U20851_M0> -> 476
473 <- 476 < T441_U20847_M0> -> 479
476 <- 479 < T448_U20845_M0> -> 477
479 <- 477 < T446_U20840_M0> -> 462
477 <- 462 < T431_U20819_M0> -> 460
462 <- 460 < T429_U20817_M0> -> 458
460 <- 458 < T427_U20814_M0> -> 454
458 <- 454 < T423_U20807_M0> -> 452
454 <- 452 < T421_U20804_M0> -> 451
452 <- 451 < T420_U20803_M0> -> 301
451 <- 301 < T270_U20802_M0> -> 298
301 <- 298 < T267_U20801_M0> -> 445
298 <- 445 < T414_U20793_M0> -> 444
445 <- 444 < T413_U20790_M0> -> 443
444 <- 443 < T412_U20789_M0> -> 441
443 <- 441 < T410_U20786_M0> -> 440
441 <- 440 < T409_U20785_M0> -> 438
440 <- 438 < T407_U20783_M0> -> 436
438 <- 436 < T405_U20781_M0> -> 431
436 <- 431 < T400_U20775_M0> -> 435
431 <- 435 < T403_U20778_M0> -> 432
435 <- 432 < T401_U20776_M0> -> 430
432 <- 430 < T399_U20774_M0> -> 428
430 <- 428 < T397_U20769_M0> -> 421
428 <- 421 < T390_U20755_M0> -> 418
421 <- 418 < T387_U20751_M0> -> 420
418 <- 420 < T389_U20754_M0> -> 417
420 <- 417 < T386_U20750_M0> -> 414
417 <- 414 < T383_U20742_M0> -> 413
414 <- 413 < T382_U20741_M0> -> 412
413 <- 412 < T381_U20740_M0> -> 411
412 <- 411 < T380_U20739_M0> -> 404
411 <- 404 < T373_U20738_M0> -> 405
404 <- 405 < T374_U20729_M0> -> 402
405 <- 402 < T371_U20725_M0> -> 401
402 <- 401 < T370_U20724_M0> -> 403
401 <- 403 < T372_U20727_M0> -> 379
403 <- 379 < T348_U20723_M0> -> 398
379 <- 398 < T367_U20720_M0> -> 400
398 <- 400 < T369_U20722_M0> -> 399
400 <- 399 < T368_U20721_M0> -> 369
399 <- 369 < T338_U20717_M0> -> 397
369 <- 397 < T366_U20718_M0> -> 393
397 <- 393 < T362_U20711_M0> -> 394
393 <- 394 < T363_U20712_M0> -> 387
394 <- 387 < T356_U20700_M0> -> 386
387 <- 386 < T355_U20698_M0> -> 385
386 <- 385 < T354_U20697_M0> -> 384
385 <- 384 < T353_U20694_M0> -> 372
384 <- 372 < T343_U20691_M0> -> 245
372 <- 245 < T213_U20687_M0> -> 373
245 <- 373 < T345_U20679_M0> -> 374
373 <- 374 < T340_U20677_M0> -> 375
374 <- 375 < T344_U20676_M0> -> 382
375 <- 382 < T351_U20671_M0> -> 381
382 <- 381 < T350_U20670_M0> -> 378
381 <- 378 < T347_U20664_M0> -> 368
378 <- 368 < T337_U20653_M0> -> 367
368 <- 367 < T336_U20648_M0> -> 366
367 <- 366 < T335_U20645_M0> -> 364
366 <- 364 < T333_U20643_M0> -> 363
364 <- 363 < T332_U20642_M0> -> 361
363 <- 361 < T330_U20639_M0> -> 248
361 <- 248 < T214_U20638_M0> -> 327
248 <- 327 < T296_U20636_M0> -> 360
327 <- 360 < T329_U20632_M0> -> 357
360 <- 357 < T326_U20626_M0> -> 358
357 <- 358 < T327_U20627_M0> -> 356
358 <- 356 < T325_U20624_M0> -> 355
356 <- 355 < T324_U20622_M0> -> 222
355 <- 222 < T191_U20619_M0> -> 352
222 <- 352 < T321_U20616_M0> -> 350
352 <- 350 < T319_U20613_M0> -> 348
350 <- 348 < T317_U20605_M0> -> 347
348 <- 347 < T316_U20603_M0> -> 349
347 <- 349 < T318_U20606_M0> -> 343
349 <- 343 < T312_U20595_M0> -> 342
343 <- 342 < T311_U20594_M0> -> 341
342 <- 341 < T310_U20592_M0> -> 340
341 <- 340 < T309_U20587_M0> -> 338
340 <- 338 < T307_U20585_M0> -> 318
338 <- 318 < T287_U20582_M0> -> 337
318 <- 337 < T306_U20581_M0> -> 336
337 <- 336 < T305_U20580_M0> -> 334
336 <- 334 < T303_U20574_M0> -> 335
334 <- 335 < T304_U20575_M0> -> 333
335 <- 333 < T302_U20568_M0> -> 332
333 <- 332 < T301_U20567_M0> -> 330
332 <- 330 < T299_U20563_M0> -> 328
330 <- 328 < T297_U20560_M0> -> 325
328 <- 325 < T294_U20553_M0> -> 326
325 <- 326 < T295_U20554_M0> -> 309
326 <- 309 < T280_U20543_M0> -> 322
309 <- 322 < T291_U20546_M0> -> 312
322 <- 312 < T283_U20541_M0> -> 310
312 <- 310 < T282_U20544_M0> -> 323
310 <- 323 < T292_U20547_M0> -> 308
323 <- 308 < T276_U20537_M0> -> 314
308 <- 314 < T278_U20536_M0> -> 277
314 <- 277 < T246_U20530_M0> -> 320
277 <- 320 < T289_U20532_M0> -> 317
320 <- 317 < T286_U20526_M0> -> 316
317 <- 316 < T285_U20525_M0> -> 315
316 <- 315 < T284_U20523_M0> -> 306
315 <- 306 < T275_U20514_M0> -> 276
306 <- 276 < T245_U20504_M0> -> 303
276 <- 303 < T272_U20501_M0> -> 300
303 <- 300 < T269_U20497_M0> -> 299
300 <- 299 < T268_U20492_M0> -> 297
299 <- 297 < T266_U20489_M0> -> 296
297 <- 296 < T265_U20488_M0> -> 292
296 <- 292 < T261_U20483_M0> -> 294
292 <- 294 < T263_U20485_M0> -> 295
294 <- 295 < T264_U20486_M0> -> 293
295 <- 293 < T262_U20484_M0> -> 290
293 <- 290 < T259_U20480_M0> -> 289
290 <- 289 < T258_U20479_M0> -> 288
289 <- 288 < T257_U20478_M0> -> 281
288 <- 281 < T250_U20467_M0> -> 274
281 <- 274 < T243_U20466_M0> -> 275
274 <- 275 < T244_U20464_M0> -> 280
275 <- 280 < T249_U20458_M0> -> 271
280 <- 271 < T240_U20455_M0> -> 135
271 <- 135 < T71_U20451_M0> -> 278
135 <- 278 < T247_U20447_M0> -> 273
278 <- 273 < T242_U20441_M0> -> 272
273 <- 272 < T241_U20440_M0> -> 241
272 <- 241 < T210_U20438_M0> -> 270
241 <- 270 < T239_U20436_M0> -> 268
270 <- 268 < T237_U20433_M0> -> 258
268 <- 258 < T227_U20432_M0> -> 267
258 <- 267 < T236_U20429_M0> -> 265
267 <- 265 < T234_U20427_M0> -> 266
265 <- 266 < T235_U20428_M0> -> 264
266 <- 264 < T233_U20426_M0> -> 263
264 <- 263 < T232_U20424_M0> -> 262
263 <- 262 < T231_U20419_M0> -> 261
262 <- 261 < T230_U20417_M0> -> 260
261 <- 260 < T229_U20416_M0> -> 229
260 <- 229 < T198_U20411_M0> -> 259
229 <- 259 < T228_U20415_M0> -> 257
259 <- 257 < T226_U20412_M0> -> 119
257 <- 119 < T11_U20406_M0> -> 242
119 <- 242 < T212_U20408_M0> -> 247
242 <- 247 < T211_U20403_M0> -> 246
247 <- 246 < T215_U20400_M0> -> 255
246 <- 255 < T224_U20389_M0> -> 249
255 <- 249 < T216_U20391_M0> -> 253
249 <- 253 < T222_U20386_M0> -> 252
253 <- 252 < T221_U20383_M0> -> 240
252 <- 240 < T209_U20370_M0> -> 250
240 <- 250 < T219_U20381_M0> -> 237
250 <- 237 < T206_U20362_M0> -> 238
237 <- 238 < T207_U20363_M0> -> 236
238 <- 236 < T205_U20360_M0> -> 234
236 <- 234 < T203_U20358_M0> -> 235
234 <- 235 < T204_U20359_M0> -> 232
235 <- 232 < T201_U20355_M0> -> 233
232 <- 233 < T202_U20357_M0> -> 230
233 <- 230 < T199_U20353_M0> -> 192
230 <- 192 < T161_U20349_M0> -> 226
192 <- 226 < T195_U20345_M0> -> 225
226 <- 225 < T194_U20343_M0> -> 224
225 <- 224 < T193_U20341_M0> -> 220
224 <- 220 < T189_U20331_M0> -> 219
220 <- 219 < T188_U20326_M0> -> 218
219 <- 218 < T187_U20325_M0> -> 213
218 <- 213 < T182_U20314_M0> -> 216
213 <- 216 < T185_U20317_M0> -> 212
216 <- 212 < T181_U20313_M0> -> 215
212 <- 215 < T184_U20316_M0> -> 211
215 <- 211 < T180_U20310_M0> -> 210
211 <- 210 < T179_U20309_M0> -> 206
210 <- 206 < T175_U20304_M0> -> 207
206 <- 207 < T176_U20305_M0> -> 208
207 <- 208 < T177_U20306_M0> -> 205
208 <- 205 < T174_U20300_M0> -> 203
205 <- 203 < T172_U20298_M0> -> 204
203 <- 204 < T173_U20299_M0> -> 202
204 <- 202 < T171_U20295_M0> -> 201
202 <- 201 < T170_U20291_M0> -> 200
201 <- 200 < T169_U20289_M0> -> 199
200 <- 199 < T168_U20288_M0> -> 196
199 <- 196 < T165_U20284_M0> -> 195
196 <- 195 < T164_U20283_M0> -> 103
195 <- 103 < T145_U20282_M0> -> 193
103 <- 193 < T162_U20279_M0> -> 91
193 <- 91 < T19_U20278_M0> -> 189
91 <- 189 < T158_U20273_M0> -> 191
189 <- 191 < T160_U20274_M0> -> 182
191 <- 182 < T151_U20265_M0> -> 176
182 <- 176 < T157_U20264_M0> -> 190
176 <- 190 < T159_U20259_M0> -> 158
190 <- 158 < T154_U20261_M0> -> 184
158 <- 184 < T135_U20255_M0> -> 180
184 <- 180 < T143_U20250_M0> -> 154
180 <- 154 < T150_U20251_M0> -> 188
154 <- 188 < T125_U20235_M0> -> 131
188 <- 131 < T137_U20236_M0> -> 160
131 <- 160 < T148_U20238_M0> -> 175
160 <- 175 < T144_U20241_M0> -> 183
175 <- 183 < T152_U20240_M0> -> 152
183 <- 152 < T78_U20239_M0> -> 146
152 <- 146 < T132_U20225_M0> -> 166
146 <- 166 < T156_U20224_M0> -> 76
166 <- 76 < T20_U20221_M0> -> 138
76 <- 138 < T106_U20213_M0> -> 133
138 <- 133 < T79_U20212_M0> -> 157
133 <- 157 < T102_U20210_M0> -> 74
157 <- 74 < T42_U20205_M0> -> 56
74 <- 56 < T35_U20203_M0> -> 59
56 <- 59 < T25_U20202_M0> -> 45
59 <- 45 < T76_U20199_M0> -> 39
45 <- 39 < T100_U20195_M0> -> 122
39 <- 122 < T92_U20196_M0> -> 171
122 <- 171 < T21_U20194_M0> -> 148
171 <- 148 < T123_U20185_M0> -> 153
148 <- 153 < T129_U20184_M0> -> 147
153 <- 147 < T115_U20182_M0> -> 170
147 <- 170 < T93_U20181_M0> -> 129
170 <- 129 < T73_U20178_M0> -> 144
129 <- 144 < T53_U20177_M0> -> 58
144 <- 58 < T77_U20172_M0> -> 117
58 <- 117 < T31_U20168_M0> -> 60
117 <- 60 < T15_U20162_M0> -> 124
60 <- 124 < T91_U20163_M0> -> 163
124 <- 163 < T133_U20159_M0> -> 70
163 <- 70 < T108_U20154_M0> -> 114
70 <- 114 < T89_U20156_M0> -> 118
114 <- 118 < T58_U20151_M0> -> 33
118 <- 33 < T14_U20145_M0> -> 95
33 <- 95 < T149_U20149_M0> -> 41
95 <- 41 < T64_U20148_M0> -> 105
41 <- 105 < T1_U20146_M0> -> 125
105 <- 125 < T84_U20142_M0> -> 97
125 <- 97 < T9_U20138_M0> -> 161
97 <- 161 < T50_U20135_M0> -> 42
161 <- 42 < T120_U20137_M0> -> 94
42 <- 94 < T40_U20136_M0> -> 177
94 <- 177 < T90_U20133_M0> -> 149
177 <- 149 < T37_U20132_M0> -> 167
149 <- 167 < T16_U20130_M0> -> 93
167 <- 93 < T70_U20109_M0> -> 86
93 <- 86 < T32_U20104_M0> -> 63
86 <- 63 < T41_U20100_M0> -> 77
63 <- 77 < T0_U20103_M0> -> 52
77 <- 52 < T26_U20099_M0> -> 132
52 <- 132 < T140_U20098_M0> -> 31
132 <- 31 < T138_U20086_M0> -> 108
31 <- 108 < T6_U20079_M0> -> 172
108 <- 172 < T85_U20077_M0> -> 65
172 <- 65 < T29_U20080_M0> -> 53
65 <- 53 < T124_U20081_M0> -> 130
53 <- 130 < T43_U20076_M0> -> 47
130 <- 47 < T4_U20075_M0> -> 169
47 <- 169 < T28_U20082_M0> -> 64
169 <- 64 < T111_U20073_M0> -> 142
64 <- 142 < T48_U20071_M0> -> 85
142 <- 85 < T117_U20074_M0> -> 151
85 <- 151 < T7_U20048_M0> -> 137
151 <- 137 < T134_U20046_M0> -> 102
137 <- 102 < T62_U20045_M0> -> 80
102 <- 80 < T5_U20044_M0> -> 46
80 <- 46 < T116_U20039_M0> -> 186
46 <- 186 < T27_U20043_M0> -> 81
186 <- 81 < T101_U20038_M0> -> 110
81 <- 110 < T94_U20042_M0> -> 116
110 <- 116 < T2_U20040_M0> -> 155
116 <- 155 < T81_U20034_M0> -> 115
155 <- 115 < T46_U20023_M0> -> 37
115 <- 37 < T66_U20030_M0> -> 173
37 <- 173 < T60_U20028_M0> -> 35
173 <- 35 < T63_U20029_M0> -> 49
35 <- 49 < T13_U20026_M0> -> 150
49 <- 150 < T56_U20025_M0> -> 101
150 <- 101 < T34_U20024_M0> -> 68
101 <- 68 < T83_U20022_M0> -> 72
68 <- 72 < T99_U20032_M0> -> 112
72 <- 112 < T110_U20031_M0> -> 43
112 <- 43 < T18_U20020_M0> -> 92
43 <- 92 < T8_U20017_M0> -> 156
92 <- 156 < T39_U20016_M0> -> 66
156 <- 66 < T114_U20014_M0> -> 113
66 <- 113 < T82_U20013_M0> -> 32
113 <- 32 < T107_U19987_M0> -> 143
32 <- 143 < T17_U19983_M0> -> 109
143 <- 109 < T153_U19985_M0> -> 79
109 <- 79 < T105_U19973_M0> -> 107
79 <- 107 < T119_U19971_M0> -> 106
107 <- 106 < T49_U19974_M0> -> 87
106 <- 87 < T54_U19975_M0> -> 134
87 <- 134 < T52_U19972_M0> -> 36
134 <- 36 < T68_U19968_M0> -> 83
36 <- 83 < T113_U19967_M0> -> 84
83 <- 84 < T44_U19965_M0> -> 98
84 <- 98 < T127_U19966_M0> -> 145
98 <- 145 < T67_U19964_M0> -> 73
145 <- 73 < T65_U19963_M0> -> 50
73 <- 50 < T80_U19969_M0> -> 120
50 <- 120 < T118_U22844_M0> -> 141
120 <- 141 < T57_U19917_M0> -> 104
141 <- 104 < T74_U19995_M0> -> 48
104 <- 48 < T97_U19999_M0> -> 38
48 <- 38 < T142_U20002_M0> -> 57
38 <- 57 < T3_U20015_M0> -> 44
57 <- 44 < T24_U20019_M0> -> 67
44 <- 67 < T122_U20050_M0> -> 185
67 <- 185 < T86_U20069_M0> -> 61
185 <- 61 < T104_U20107_M0> -> 99
61 <- 99 < T139_U20112_M0> -> 55
99 <- 55 < T96_U20125_M0> -> 181
55 <- 181 < T87_U19196_M0> -> 159
181 <- 159 < T128_U20140_M0> -> 121
159 <- 121 < T126_U18279_M0> -> 123
121 <- 123 < T95_U20155_M0> -> 179
123 <- 179 < T136_U20158_M0> -> 82
179 <- 82 < T141_U20176_M0> -> 127
82 <- 127 < T45_U20180_M0> -> 51
127 <- 51 < T98_U18282_M0> -> 40
51 <- 40 < T51_U20215_M0> -> 71
40 <- 71 < T36_U20219_M0> -> 168
71 <- 168 < T146_U20253_M0> -> 128
168 <- 128 < T155_U20258_M0> -> 75
128 <- 75 < T33_U20262_M0> -> 178
75 <- 178 < T130_U20267_M0> -> 194
178 <- 194 < T163_U20281_M0> -> 197
194 <- 197 < T166_U20285_M0> -> 198
197 <- 198 < T167_U20303_M0> -> 209
198 <- 209 < T178_U20308_M0> -> 214
209 <- 214 < T183_U20315_M0> -> 217
214 <- 217 < T186_U20320_M0> -> 187
217 <- 187 < T112_U20329_M0> -> 174
187 <- 174 < T131_U20333_M0> -> 223
174 <- 223 < T192_U20338_M0> -> 227
223 <- 227 < T196_U20346_M0> -> 251
227 <- 251 < T220_U20382_M0> -> 254
251 <- 254 < T223_U20387_M0> -> 244
254 <- 244 < T218_U20394_M0> -> 243
244 <- 243 < T217_U20407_M0> -> 256
243 <- 256 < T225_U20410_M0> -> 269
256 <- 269 < T238_U20435_M0> -> 279
269 <- 279 < T248_U20456_M0> -> 283
279 <- 283 < T252_U20469_M0> -> 285
283 <- 285 < T254_U20474_M0> -> 302
285 <- 302 < T271_U20500_M0> -> 304
302 <- 304 < T273_U20507_M0> -> 305
304 <- 305 < T274_U20509_M0> -> 321
305 <- 321 < T290_U20533_M0> -> 313
321 <- 313 < T281_U20535_M0> -> 311
313 <- 311 < T279_U20542_M0> -> 165
311 <- 165 < T75_U20558_M0> -> 329
165 <- 329 < T298_U20562_M0> -> 339
329 <- 339 < T308_U20586_M0> -> 345
339 <- 345 < T314_U20597_M0> -> 307
345 <- 307 < T277_U20600_M0> -> 351
307 <- 351 < T320_U20615_M0> -> 353
351 <- 353 < T322_U20620_M0> -> 359
353 <- 359 < T328_U20630_M0> -> 362
359 <- 362 < T331_U20641_M0> -> 380
362 <- 380 < T349_U20669_M0> -> 376
380 <- 376 < T339_U20673_M0> -> 370
376 <- 370 < T341_U20690_M0> -> 365
370 <- 365 < T334_U20692_M0> -> 388
365 <- 388 < T357_U20702_M0> -> 389
388 <- 389 < T358_U20703_M0> -> 390
389 <- 390 < T359_U20704_M0> -> 391
390 <- 391 < T360_U20705_M0> -> 392
391 <- 392 < T361_U20707_M0> -> 396
392 <- 396 < T365_U20714_M0> -> 383
396 <- 383 < T352_U20716_M0> -> 406
383 <- 406 < T375_U20731_M0> -> 407
406 <- 407 < T376_U20732_M0> -> 408
407 <- 408 < T377_U20733_M0> -> 409
408 <- 409 < T378_U20734_M0> -> 410
409 <- 410 < T379_U20736_M0> -> 416
410 <- 416 < T385_U20745_M0> -> 287
416 <- 287 < T256_U20748_M0> -> 419
287 <- 419 < T388_U20752_M0> -> 422
419 <- 422 < T391_U20758_M0> -> 423
422 <- 423 < T392_U20760_M0> -> 424
423 <- 424 < T393_U20761_M0> -> 425
424 <- 425 < T394_U20762_M0> -> 426
425 <- 426 < T395_U20763_M0> -> 427
426 <- 427 < T396_U20764_M0> -> 433
427 <- 433 < T402_U20777_M0> -> 437
433 <- 437 < T406_U20782_M0> -> 439
437 <- 439 < T408_U20784_M0> -> 446
439 <- 446 < T415_U20795_M0> -> 447
446 <- 447 < T416_U20796_M0> -> 448
447 <- 448 < T417_U20797_M0> -> 449
448 <- 449 < T418_U20798_M0> -> 450
449 <- 450 < T419_U20799_M0> -> 324
450 <- 324 < T293_U20809_M0> -> 455
324 <- 455 < T424_U20810_M0> -> 456
455 <- 456 < T425_U20812_M0> -> 461
456 <- 461 < T430_U20818_M0> -> 463
461 <- 463 < T432_U20821_M0> -> 464
463 <- 464 < T433_U20823_M0> -> 465
464 <- 465 < T434_U20824_M0> -> 466
465 <- 466 < T435_U20825_M0> -> 467
466 <- 467 < T436_U20827_M0> -> 468
467 <- 468 < T437_U20828_M0> -> 459
468 <- 459 < T428_U20844_M0> -> 472
459 <- 472 < T440_U20854_M0> -> 470
472 <- 470 < T439_U20856_M0> -> 480
470 <- 480 < T449_U20859_M0> -> 481
480 <- 481 < T450_U20860_M0> -> 482
481 <- 482 < T451_U20861_M0> -> 483
482 <- 483 < T452_U20863_M0> -> 484
483 <- 484 < T453_U20864_M0> -> 491
484 <- 491 < T460_U20874_M0> -> 415
491 <- 415 < T384_U20876_M0> -> 498
415 <- 498 < T467_U20887_M0> -> 499
498 <- 499 < T468_U20888_M0> -> 500
499 <- 500 < T469_U20890_M0> -> 501
500 <- 501 < T470_U20891_M0> -> 502
501 <- 502 < T471_U20892_M0> -> 503
502 <- 503 < T472_U20893_M0> -> 504
503 <- 504 < T473_U20896_M0> -> 505
504 <- 505 < T474_U20897_M0> -> 509
505 <- 509 < T478_U20903_M0> -> 510
509 <- 510 < T479_U20908_M0> -> 515
510 <- 515 < T484_U20916_M0> -> 516
515 <- 516 < T485_U20917_M0> -> 517
516 <- 517 < T486_U20918_M0> -> 519
517 <- 519 < T488_U20921_M0> -> 520
519 <- 520 < T489_U20922_M0> -> 521
520 <- 521 < T490_U20923_M0> -> 522
521 <- 522 < T491_U20924_M0> -> 523
522 <- 523 < T492_U20926_M0> -> 524
523 <- 524 < T493_U20927_M0> -> 475
524 <- 475 < T445_U20936_M0> -> 531
475 <- 531 < T500_U20941_M0> -> 533
531 <- 533 < T502_U20946_M0> -> 534
533 <- 534 < T503_U20947_M0> -> 535
534 <- 535 < T504_U20949_M0> -> 536
535 <- 536 < T505_U20950_M0> -> 537
536 <- 537 < T506_U20951_M0> -> 538
537 <- 538 < T507_U20952_M0> -> 539
538 <- 539 < T508_U20953_M0> -> 469
539 <- 469 < T438_U20961_M0> -> 545
469 <- 545 < T514_U20967_M0> -> 550
545 <- 550 < T519_U20975_M0> -> 551
550 <- 551 < T520_U20976_M0> -> 552
551 <- 552 < T521_U20977_M0> -> 558
552 <- 558 < T527_U20984_M0> -> 559
558 <- 559 < T528_U20985_M0> -> 560
559 <- 560 < T529_U20986_M0> -> 561
560 <- 561 < T530_U20987_M0> -> 562
561 <- 562 < T531_U20988_M0> -> 554
562 <- 554 < T522_U20997_M0> -> 570
554 <- 570 < T539_U21007_M0> -> 571
570 <- 571 < T540_U21008_M0> -> 572
571 <- 572 < T541_U21009_M0> -> 573
572 <- 573 < T542_U21010_M0> -> 574
573 <- 574 < T543_U21012_M0> -> 575
574 <- 575 < T544_U21013_M0> -> 576
575 <- 576 < T545_U21014_M0> -> 577
576 <- 577 < T546_U21015_M0> -> 578
577 <- 578 < T547_U21016_M0> -> 590
578 <- 590 < T559_U21035_M0> -> 581
590 <- 581 < T550_U21049_M0> -> 593
581 <- 593 < T562_U21050_M0> -> 595
593 <- 595 < T564_U21053_M0> -> 596
595 <- 596 < T565_U21054_M0> -> 597
596 <- 597 < T566_U21055_M0> -> 598
597 <- 598 < T567_U21056_M0> -> 599
598 <- 599 < T568_U21057_M0> -> 556
599 <- 556 < T525_U21069_M0> -> 609
556 <- 609 < T578_U21075_M0> -> 612
609 <- 612 < T581_U21080_M0> -> 613
612 <- 613 < T582_U21081_M0> -> 614
613 <- 614 < T583_U21082_M0> -> 615
614 <- 615 < T584_U21084_M0> -> 616
615 <- 616 < T585_U21085_M0> -> 617
616 <- 617 < T586_U21086_M0> -> 618
617 <- 618 < T587_U21088_M0> -> 619
618 <- 619 < T588_U21089_M0> -> 565
619 <- 565 < T534_U21096_M0> -> 624
565 <- 624 < T593_U21101_M0> -> 580
624 <- 580 < T549_U21103_M0> -> 631
580 <- 631 < T600_U21110_M0> -> 633
631 <- 633 < T602_U21113_M0> -> 634
633 <- 634 < T603_U21114_M0> -> 635
634 <- 635 < T604_U21115_M0> -> 636
635 <- 636 < T605_U21117_M0> -> 637
636 <- 637 < T606_U21118_M0> -> 588
637 <- 588 < T557_U21129_M0> -> 645
588 <- 645 < T614_U21133_M0> -> 646
645 <- 646 < T615_U21134_M0> -> 650
646 <- 650 < T619_U21141_M0> -> 651
650 <- 651 < T620_U21142_M0> -> 652
651 <- 652 < T621_U21144_M0> -> 653
652 <- 653 < T622_U21145_M0> -> 654
653 <- 654 < T623_U21146_M0> -> 655
654 <- 655 < T624_U21147_M0> -> 656
655 <- 656 < T625_U21148_M0> -> 660
656 <- 660 < T629_U21160_M0> -> 663
660 <- 663 < T632_U21166_M0> -> 665
663 <- 665 < T634_U21169_M0> -> 666
665 <- 666 < T635_U21170_M0> -> 667
666 <- 667 < T636_U21171_M0> -> 668
667 <- 668 < T637_U21172_M0> -> 669
668 <- 669 < T638_U21173_M0> -> 672
669 <- 672 < T641_U21180_M0> -> 632
672 <- 632 < T601_U21183_M0> -> 675
632 <- 675 < T644_U21187_M0> -> 683
675 <- 683 < T652_U21198_M0> -> 684
683 <- 684 < T653_U21199_M0> -> 685
684 <- 685 < T654_U21201_M0> -> 686
685 <- 686 < T655_U21202_M0> -> 687
686 <- 687 < T656_U21203_M0> -> 688
687 <- 688 < T657_U21204_M0> -> 689
688 <- 689 < T658_U21205_M0> -> 704
689 <- 704 < T673_U21224_M0> -> 705
704 <- 705 < T674_U21228_M0> -> 701
705 <- 701 < T669_U21231_M0> -> 699
701 <- 699 < T666_U21238_M0> -> 695
699 <- 695 < T664_U21241_M0> -> 696
695 <- 696 < T665_U21242_M0> -> 707
696 <- 707 < T676_U21243_M0> -> 708
707 <- 708 < T677_U21244_M0> -> 709
708 <- 709 < T678_U21245_M0> -> 442
709 <- 442 < T411_U21254_M0> -> 715
442 <- 715 < T684_U21259_M0> -> 716
715 <- 716 < T685_U21260_M0> -> 723
716 <- 723 < T692_U21270_M0> -> 724
723 <- 724 < T693_U21272_M0> -> 725
724 <- 725 < T694_U21273_M0> -> 726
725 <- 726 < T695_U21274_M0> -> 727
726 <- 727 < T696_U21275_M0> -> 728
727 <- 728 < T697_U21276_M0> -> 737
728 <- 737 < T706_U21293_M0> -> 738
737 <- 738 < T707_U21294_M0> -> 743
738 <- 743 < T712_U21303_M0> -> 744
743 <- 744 < T713_U21304_M0> -> 745
744 <- 745 < T714_U21305_M0> -> 746
745 <- 746 < T715_U21306_M0> -> 747
746 <- 747 < T716_U21308_M0> -> 748
747 <- 748 < T717_U21315_M0> -> 752
748 <- 752 < T721_U21323_M0> -> 753
752 <- 753 < T722_U21325_M0> -> 754
753 <- 754 < T723_U21326_M0> -> 755
754 <- 755 < T724_U21327_M0> -> 756
755 <- 756 < T725_U21328_M0> -> 757
756 <- 757 < T726_U21330_M0> -> 766
757 <- 766 < T735_U21342_M0> -> 768
766 <- 768 < T737_U21347_M0> -> 779
768 <- 779 < T748_U21363_M0> -> 780
779 <- 780 < T749_U21364_M0> -> 781
780 <- 781 < T750_U21365_M0> -> 782
781 <- 782 < T751_U21366_M0> -> 783
782 <- 783 < T752_U21368_M0> -> 784
783 <- 784 < T753_U21372_M0> -> 594
784 <- 594 < T563_U21374_M0> -> 787
594 <- 787 < T756_U21379_M0> -> 791
787 <- 791 < T760_U21385_M0> -> 792
791 <- 792 < T761_U21386_M0> -> 793
792 <- 793 < T762_U21388_M0> -> 794
793 <- 794 < T763_U21389_M0> -> 795
794 <- 795 < T764_U21390_M0> -> 796
795 <- 796 < T765_U21392_M0> -> 797
796 <- 797 < T766_U21394_M0> -> 802
797 <- 802 < T772_U21419_M0> -> 803
802 <- 803 < T768_U21424_M0> -> 816
803 <- 816 < T785_U21431_M0> -> 817
816 <- 817 < T786_U21432_M0> -> 818
817 <- 818 < T787_U21433_M0> -> 819
818 <- 819 < T788_U21435_M0> -> 820
819 <- 820 < T789_U21437_M0> -> 603
820 <- 603 < T572_U21442_M0> -> 824
603 <- 824 < T793_U21447_M0> -> 829
824 <- 829 < T798_U21453_M0> -> 831
829 <- 831 < T800_U21456_M0> -> 832
831 <- 832 < T801_U21458_M0> -> 833
832 <- 833 < T802_U21459_M0> -> 834
833 <- 834 < T803_U21460_M0> -> 835
834 <- 835 < T804_U21461_M0> -> 836
835 <- 836 < T805_U21464_M0> -> 837
836 <- 837 < T806_U21466_M0> -> 842
837 <- 842 < T811_U21475_M0> -> 826
842 <- 826 < T795_U21480_M0> -> 848
826 <- 848 < T817_U21486_M0> -> 849
848 <- 849 < T818_U21487_M0> -> 850
849 <- 850 < T819_U21488_M0> -> 851
850 <- 851 < T820_U21489_M0> -> 852
851 <- 852 < T821_U21491_M0> -> 846
852 <- 846 < T815_U21493_M0> -> 855
846 <- 855 < T824_U21497_M0> -> 805
855 <- 805 < T773_U21498_M0> -> 856
805 <- 856 < T825_U21501_M0> -> 862
856 <- 862 < T831_U21511_M0> -> 863
862 <- 863 < T832_U21513_M0> -> 864
863 <- 864 < T833_U21514_M0> -> 865
864 <- 865 < T834_U21515_M0> -> 866
865 <- 866 < T835_U21516_M0> -> 868
866 <- 868 < T837_U21520_M0> -> 869
868 <- 869 < T838_U21523_M0> -> 875
869 <- 875 < T844_U21536_M0> -> 877
875 <- 877 < T846_U21539_M0> -> 878
877 <- 878 < T847_U21540_M0> -> 879
878 <- 879 < T848_U21541_M0> -> 880
879 <- 880 < T849_U21542_M0> -> 881
880 <- 881 < T850_U21545_M0> -> 453
881 <- 453 < T422_U21554_M0> -> 888
453 <- 888 < T857_U21558_M0> -> 893
888 <- 893 < T862_U21565_M0> -> 894
893 <- 894 < T863_U21567_M0> -> 895
894 <- 895 < T864_U21568_M0> -> 896
895 <- 896 < T865_U21569_M0> -> 897
896 <- 897 < T866_U21571_M0> -> 898
897 <- 898 < T867_U21573_M0> -> 890
898 <- 890 < T859_U21588_M0> -> 909
890 <- 909 < T878_U21589_M0> -> 906
909 <- 906 < T873_U21590_M0> -> 905
906 <- 905 < T874_U21591_M0> -> 904
905 <- 904 < T875_U21593_M0> -> 903
904 <- 903 < T872_U21594_M0> -> 902
903 <- 902 < T871_U21595_M0> -> 901
902 <- 901 < T868_U21597_M0> -> 900
901 <- 900 < T869_U21598_M0> -> 899
900 <- 899 < T870_U21602_M0> -> 910
899 <- 910 < T879_U21603_M0> -> 911
910 <- 911 < T880_U21604_M0> -> 912
911 <- 912 < T881_U21606_M0> -> 913
912 <- 913 < T882_U21608_M0> -> 914
913 <- 914 < T883_U21610_M0> -> 710
914 <- 710 < T679_U21614_M0> -> 886
710 <- 886 < T855_U21617_M0> -> 916
886 <- 916 < T885_U21618_M0> -> 917
916 <- 917 < T886_U21621_M0> -> 918
917 <- 918 < T887_U21622_M0> -> 919
918 <- 919 < T888_U21623_M0> -> 920
919 <- 920 < T889_U21625_M0> -> 921
920 <- 921 < T890_U21626_M0> -> 922
921 <- 922 < T891_U21627_M0> -> 923
922 <- 923 < T892_U21629_M0> -> 924
923 <- 924 < T893_U21631_M0> -> 925
924 <- 925 < T894_U21633_M0> -> 926
925 <- 926 < T895_U21634_M0> -> 927
926 <- 927 < T896_U21635_M0> -> 928
927 <- 928 < T897_U21637_M0> -> 929
928 <- 929 < T898_U21638_M0> -> 930
929 <- 930 < T899_U21640_M0> -> -1
Session queue dump (low priority, 2 elements, peak 25):
-1 <- 140 < T30_U25456_M0> -> 164
140 <- 164 < T103_U20248_M0> -> -1

Requests in queue <W2> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_PROCESS
Requests in queue <T138_U20086_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T107_U19987_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T14_U20145_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T109_U5012_M1> (2 requests, queue in use):
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T63_U20029_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T68_U19968_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T66_U20030_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T142_U20002_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T100_U20195_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T51_U20215_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T64_U20148_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T120_U20137_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T18_U20020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T24_U20019_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T76_U20199_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T116_U20039_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T4_U20075_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T97_U19999_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T13_U20026_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T80_U19969_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T98_U18282_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T26_U20099_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T124_U20081_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T23_U20175_M0> (16 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T96_U20125_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T35_U20203_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T3_U20015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T77_U20172_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T25_U20202_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T15_U20162_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T104_U20107_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T121_U20226_M0> (16 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T41_U20100_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T111_U20073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T29_U20080_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T114_U20014_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T122_U20050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T83_U20022_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T61_U20113_M0> (17 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 8 requests for handler REQ_HANDLER_SESSION
Requests in queue <T108_U20154_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T36_U20219_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T99_U20032_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T65_U19963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T42_U20205_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T33_U20262_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T20_U20221_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T0_U20103_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T88_U20160_M0> (29 requests):
- 20 requests for handler REQ_HANDLER_PLUGIN
- 9 requests for handler REQ_HANDLER_SESSION
Requests in queue <T105_U19973_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T5_U20044_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T101_U20038_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T141_U20176_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T113_U19967_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T44_U19965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T117_U20074_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T32_U20104_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T54_U19975_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T12_U20220_M0> (16 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T19_U20278_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T8_U20017_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T70_U20109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T40_U20136_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T149_U20149_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T9_U20138_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T127_U19966_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T139_U20112_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T38_U20216_M0> (18 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 8 requests for handler REQ_HANDLER_SESSION
Requests in queue <T34_U20024_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T62_U20045_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T145_U20282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T74_U19995_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T1_U20146_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T49_U19974_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T119_U19971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T6_U20079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T153_U19985_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T94_U20042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T110_U20031_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T82_U20013_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T89_U20156_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T46_U20023_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T2_U20040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T31_U20168_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T58_U20151_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T11_U20406_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T118_U22844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T126_U18279_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T92_U20196_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T95_U20155_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T91_U20163_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T84_U20142_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T55_U19953_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T45_U20180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T155_U20258_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T73_U20178_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T43_U20076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T137_U20236_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T140_U20098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T79_U20212_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T52_U19972_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T71_U20451_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T134_U20046_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T106_U20213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T22_U19960_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T30_U25456_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T57_U19917_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_RFC
- 1 requests for handler REQ_HANDLER_WAIT_RESP
Requests in queue <T48_U20071_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T17_U19983_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T53_U20177_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T67_U19964_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T132_U20225_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T115_U20182_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T123_U20185_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T37_U20132_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T56_U20025_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T7_U20048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T78_U20239_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T129_U20184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T150_U20251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T81_U20034_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T39_U20016_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T102_U20210_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T154_U20261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T128_U20140_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T148_U20238_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T50_U20135_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T59_U19947_M0> (1 requests, queue in use):
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T133_U20159_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T103_U20248_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T75_U20558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T156_U20224_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T16_U20130_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T146_U20253_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T28_U20082_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T93_U20181_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T21_U20194_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T85_U20077_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T60_U20028_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T131_U20333_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T144_U20241_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T157_U20264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T90_U20133_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T130_U20267_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T136_U20158_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T143_U20250_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T87_U19196_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T151_U20265_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T152_U20240_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T135_U20255_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T86_U20069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T27_U20043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T112_U20329_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T125_U20235_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T158_U20273_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T159_U20259_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T160_U20274_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T161_U20349_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T162_U20279_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T163_U20281_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T164_U20283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T165_U20284_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T166_U20285_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T167_U20303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T168_U20288_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T169_U20289_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T170_U20291_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T171_U20295_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T172_U20298_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T173_U20299_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T174_U20300_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T175_U20304_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T176_U20305_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T177_U20306_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T178_U20308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T179_U20309_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T180_U20310_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T181_U20313_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T182_U20314_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T183_U20315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T184_U20316_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T185_U20317_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T186_U20320_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T187_U20325_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T188_U20326_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T189_U20331_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T190_U20350_M0> (22 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 8 requests for handler REQ_HANDLER_SESSION
Requests in queue <T191_U20619_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T192_U20338_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T193_U20341_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T194_U20343_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T195_U20345_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T196_U20346_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T197_U20347_M0> (17 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T198_U20411_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T199_U20353_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T200_U20354_M0> (22 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 8 requests for handler REQ_HANDLER_SESSION
Requests in queue <T201_U20355_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T202_U20357_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T203_U20358_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T204_U20359_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T205_U20360_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T206_U20362_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T207_U20363_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T208_U20991_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T209_U20370_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T210_U20438_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T212_U20408_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T217_U20407_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T218_U20394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T213_U20687_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T215_U20400_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T211_U20403_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T214_U20638_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T216_U20391_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T219_U20381_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T220_U20382_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T221_U20383_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T222_U20386_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T223_U20387_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T224_U20389_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T225_U20410_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T226_U20412_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T227_U20432_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T228_U20415_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T229_U20416_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T230_U20417_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T231_U20419_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T232_U20424_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T233_U20426_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T234_U20427_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T235_U20428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T236_U20429_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T237_U20433_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T238_U20435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T239_U20436_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T240_U20455_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T241_U20440_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T242_U20441_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T243_U20466_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T244_U20464_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T245_U20504_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T246_U20530_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T247_U20447_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T248_U20456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T249_U20458_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T250_U20467_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T251_U20559_M0> (21 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T252_U20469_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T253_U20470_M0> (16 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T254_U20474_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T255_U20475_M0> (21 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T256_U20748_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T257_U20478_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T258_U20479_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T259_U20480_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T260_U20481_M0> (21 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 7 requests for handler REQ_HANDLER_SESSION
Requests in queue <T261_U20483_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T262_U20484_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T263_U20485_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T264_U20486_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T265_U20488_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T266_U20489_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T267_U20801_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T268_U20492_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T269_U20497_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T270_U20802_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T271_U20500_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T272_U20501_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T273_U20507_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T274_U20509_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T275_U20514_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T277_U20600_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T276_U20537_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T280_U20543_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T282_U20544_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T279_U20542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T283_U20541_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T281_U20535_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T278_U20536_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T284_U20523_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T285_U20525_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T286_U20526_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T287_U20582_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T288_U20578_M0> (14 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T289_U20532_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T290_U20533_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T291_U20546_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T292_U20547_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T293_U20809_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T294_U20553_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T295_U20554_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T296_U20636_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T297_U20560_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T298_U20562_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T299_U20563_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T300_U20907_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T301_U20567_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T302_U20568_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T303_U20574_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T304_U20575_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T305_U20580_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T306_U20581_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T307_U20585_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T308_U20586_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T309_U20587_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T310_U20592_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T311_U20594_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T312_U20595_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T313_U20596_M0> (16 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T314_U20597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T315_U20602_M0> (20 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 6 requests for handler REQ_HANDLER_SESSION
Requests in queue <T316_U20603_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T317_U20605_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T318_U20606_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T319_U20613_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T320_U20615_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T321_U20616_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T322_U20620_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T323_U20621_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T324_U20622_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T325_U20624_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T326_U20626_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T327_U20627_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T328_U20630_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T329_U20632_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T330_U20639_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T331_U20641_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T332_U20642_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T333_U20643_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T334_U20692_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T335_U20645_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T336_U20648_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T337_U20653_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T338_U20717_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T341_U20690_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T342_U20674_M0> (13 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T343_U20691_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T345_U20679_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T340_U20677_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T344_U20676_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T339_U20673_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T346_U20990_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T347_U20664_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T348_U20723_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T349_U20669_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T350_U20670_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T351_U20671_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T352_U20716_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T353_U20694_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T354_U20697_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T355_U20698_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T356_U20700_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T357_U20702_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T358_U20703_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T359_U20704_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T360_U20705_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T361_U20707_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T362_U20711_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T363_U20712_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T364_U20713_M0> (15 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T365_U20714_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T366_U20718_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T367_U20720_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T368_U20721_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T369_U20722_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T370_U20724_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T371_U20725_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T372_U20727_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T373_U20738_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T374_U20729_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T375_U20731_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T376_U20732_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T377_U20733_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T378_U20734_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T379_U20736_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T380_U20739_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T381_U20740_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T382_U20741_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T383_U20742_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T384_U20876_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T385_U20745_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T386_U20750_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T387_U20751_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T388_U20752_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T389_U20754_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T390_U20755_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T391_U20758_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T392_U20760_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T393_U20761_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T394_U20762_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T395_U20763_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T396_U20764_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T397_U20769_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T398_U20877_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T399_U20774_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T400_U20775_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T401_U20776_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T402_U20777_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T404_U20779_M0> (15 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T403_U20778_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T405_U20781_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T406_U20782_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T407_U20783_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T408_U20784_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T409_U20785_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T410_U20786_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T411_U21254_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T412_U20789_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T413_U20790_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T414_U20793_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T415_U20795_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T416_U20796_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T417_U20797_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T418_U20798_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T419_U20799_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T420_U20803_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T421_U20804_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T422_U21554_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T423_U20807_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T424_U20810_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T425_U20812_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T426_U20813_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T427_U20814_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T428_U20844_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T429_U20817_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T430_U20818_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T431_U20819_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T432_U20821_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T433_U20823_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T434_U20824_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T435_U20825_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T436_U20827_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T437_U20828_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T438_U20961_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T439_U20856_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T444_U20853_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T440_U20854_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T442_U20851_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T443_U20850_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T445_U20936_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T441_U20847_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T446_U20840_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T447_U20910_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T448_U20845_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T449_U20859_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T450_U20860_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T451_U20861_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T452_U20863_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T453_U20864_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T454_U20867_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T455_U20868_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T456_U20869_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T457_U20870_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T458_U20872_M0> (14 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T459_U20873_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T460_U20874_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T461_U20878_M0> (21 requests):
- 16 requests for handler REQ_HANDLER_PLUGIN
- 5 requests for handler REQ_HANDLER_SESSION
Requests in queue <T462_U20880_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T463_U20881_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T464_U20911_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T465_U20884_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T466_U20886_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T467_U20887_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T468_U20888_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T469_U20890_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T470_U20891_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T471_U20892_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T472_U20893_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T473_U20896_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T474_U20897_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T475_U20899_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T476_U20901_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T477_U20902_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T478_U20903_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T479_U20908_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T480_U20909_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T481_U20912_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T482_U20914_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T483_U20929_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T484_U20916_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T485_U20917_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T486_U20918_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T487_U20919_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T488_U20921_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T489_U20922_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T490_U20923_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T491_U20924_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T492_U20926_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T493_U20927_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T494_U20930_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T495_U20931_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T496_U20933_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T497_U20937_M0> (12 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T498_U20938_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T499_U20939_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T500_U20941_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T501_U20942_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T502_U20946_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T503_U20947_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T504_U20949_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T505_U20950_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T506_U20951_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T507_U20952_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T508_U20953_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T509_U20958_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T510_U20962_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T511_U20963_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T512_U20965_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T513_U20966_M0> (23 requests):
- 19 requests for handler REQ_HANDLER_PLUGIN
- 4 requests for handler REQ_HANDLER_SESSION
Requests in queue <T514_U20967_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T515_U20968_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T516_U20969_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T517_U20971_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T518_U20974_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T519_U20975_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T520_U20976_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T521_U20977_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T523_U20998_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T522_U20997_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T524_U21071_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T525_U21069_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T526_U21370_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T527_U20984_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T528_U20985_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T529_U20986_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T530_U20987_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T531_U20988_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T532_U20992_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T533_U20994_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T534_U21096_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T535_U21000_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T536_U21001_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T537_U21002_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T538_U21004_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T539_U21007_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T540_U21008_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T541_U21009_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T542_U21010_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T543_U21012_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T544_U21013_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T545_U21014_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T546_U21015_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T547_U21016_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T548_U21020_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T549_U21103_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T550_U21049_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T551_U21047_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T552_U21048_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T553_U21041_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T556_U21043_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T555_U21042_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T554_U21040_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T557_U21129_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T558_U21031_M0> (13 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T559_U21035_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T560_U21036_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T561_U21037_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T562_U21050_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T563_U21374_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T564_U21053_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T565_U21054_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T566_U21055_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T567_U21056_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T568_U21057_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T569_U21061_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T570_U21062_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T571_U21063_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T572_U21442_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T573_U21066_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T574_U21067_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T575_U21072_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T576_U21073_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T577_U21097_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T578_U21075_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T579_U21076_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T580_U21079_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T581_U21080_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T582_U21081_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T583_U21082_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T584_U21084_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T585_U21085_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T586_U21086_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T587_U21088_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T588_U21089_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T589_U21092_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T590_U21094_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T591_U21098_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T592_U21099_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T593_U21101_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T594_U21102_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T595_U21104_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T596_U21105_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T597_U21120_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T598_U21108_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T599_U21109_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T600_U21110_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T601_U21183_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T602_U21113_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T603_U21114_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T604_U21115_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T605_U21117_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T606_U21118_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T607_U21121_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T608_U21122_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T609_U21473_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T610_U21125_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T611_U21375_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T612_U21130_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T613_U21131_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T614_U21133_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T615_U21134_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T616_U21135_M0> (11 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T617_U21136_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T618_U21138_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T619_U21141_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T620_U21142_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T621_U21144_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T622_U21145_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T623_U21146_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T624_U21147_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T625_U21148_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T626_U21152_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T627_U21157_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T628_U21159_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T629_U21160_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T630_U21161_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T631_U21162_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T632_U21166_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T633_U21167_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T634_U21169_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T635_U21170_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T636_U21171_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T637_U21172_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T638_U21173_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T639_U21178_M0> (12 requests):
- 10 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T640_U21310_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T641_U21180_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T642_U21184_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T643_U21185_M0> (17 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 3 requests for handler REQ_HANDLER_SESSION
Requests in queue <T644_U21187_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T645_U21188_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T646_U21189_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T647_U21190_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T648_U21191_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T649_U21192_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T650_U21193_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T651_U21197_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T652_U21198_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T653_U21199_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T654_U21201_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T655_U21202_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T656_U21203_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T657_U21204_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T658_U21205_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T659_U21209_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T660_U21210_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T661_U21211_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T662_U21212_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T663_U21213_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T664_U21241_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T665_U21242_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T667_U21553_M0> (7 requests):
- 6 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T671_U21232_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T666_U21238_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T668_U21233_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T669_U21231_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T670_U21234_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T672_U21223_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T673_U21224_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T674_U21228_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T675_U21230_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T676_U21243_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T677_U21244_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T678_U21245_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T679_U21614_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T680_U21251_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T681_U21255_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T682_U21256_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T683_U21257_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T684_U21259_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T685_U21260_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T686_U21261_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T687_U21262_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T688_U21263_M0> (6 requests):
- 5 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T689_U21264_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T690_U21266_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T691_U21269_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T692_U21270_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T693_U21272_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T694_U21273_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T695_U21274_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T696_U21275_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T697_U21276_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T698_U21280_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T699_U21281_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T700_U21282_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T701_U21283_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T702_U21285_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T703_U21289_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T704_U21290_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T705_U21292_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T706_U21293_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T707_U21294_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T708_U21295_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T709_U21296_M0> (16 requests):
- 14 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T710_U21297_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T711_U21301_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T712_U21303_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T713_U21304_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T714_U21305_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T715_U21306_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T716_U21308_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T717_U21315_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T718_U21317_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T719_U21318_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T720_U21322_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T721_U21323_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T722_U21325_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T723_U21326_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T724_U21327_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T725_U21328_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T726_U21330_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T727_U21333_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T728_U21334_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T729_U21335_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T730_U21336_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T731_U21337_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T732_U21339_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T733_U21340_M0> (10 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T734_U21341_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T735_U21342_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T736_U21346_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T737_U21347_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T738_U21348_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T739_U21349_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T740_U21351_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T741_U21352_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T742_U21353_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T743_U21354_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T744_U21355_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T745_U21356_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T746_U21358_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T747_U21361_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T748_U21363_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T749_U21364_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T750_U21365_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T751_U21366_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T752_U21368_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T753_U21372_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T754_U21377_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T755_U21443_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T756_U21379_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T757_U21380_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T758_U21382_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T759_U21384_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T760_U21385_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T761_U21386_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T762_U21388_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T763_U21389_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T764_U21390_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T765_U21392_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T766_U21394_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T767_U21521_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T770_U21420_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T769_U21426_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T774_U21428_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T772_U21419_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T768_U21424_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T771_U21421_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T773_U21498_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T775_U21404_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T776_U21405_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T777_U21406_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T778_U21407_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T779_U21409_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T780_U21479_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T781_U21411_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T782_U21415_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T783_U21416_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T784_U21417_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T785_U21431_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T786_U21432_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T787_U21433_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T788_U21435_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T789_U21437_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T790_U21439_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T791_U21444_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T792_U21445_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T793_U21447_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T794_U21448_M0> (15 requests):
- 13 requests for handler REQ_HANDLER_PLUGIN
- 2 requests for handler REQ_HANDLER_SESSION
Requests in queue <T795_U21480_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T796_U21451_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T797_U21452_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T798_U21453_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T799_U21455_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T800_U21456_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T801_U21458_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T802_U21459_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T803_U21460_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T804_U21461_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T805_U21464_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T806_U21466_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T807_U21467_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T808_U21468_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T809_U21469_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T810_U21471_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T811_U21475_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T812_U21476_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T813_U21478_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T814_U21482_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T815_U21493_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T816_U21485_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T817_U21486_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T818_U21487_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T819_U21488_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T820_U21489_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T821_U21491_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T822_U21494_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T823_U21495_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T824_U21497_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T825_U21501_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T826_U21502_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T827_U21504_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T828_U21505_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T829_U21506_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T830_U21507_M0> (3 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T831_U21511_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T832_U21513_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T833_U21514_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T834_U21515_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T835_U21516_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T836_U21546_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T837_U21520_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T838_U21523_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T839_U21527_M0> (9 requests):
- 8 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T840_U21528_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T841_U21530_M0> (5 requests):
- 4 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T842_U21531_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T843_U21535_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T844_U21536_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T845_U21538_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T846_U21539_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T847_U21540_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T848_U21541_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T849_U21542_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T850_U21545_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T851_U21547_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T852_U21548_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T853_U21550_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T855_U21617_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T856_U21556_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T857_U21558_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T858_U21559_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T859_U21588_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T860_U21561_M0> (4 requests):
- 3 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T861_U21562_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T862_U21565_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T863_U21567_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T864_U21568_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T865_U21569_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T866_U21571_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T867_U21573_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T870_U21602_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T869_U21598_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T868_U21597_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T871_U21595_M0> (18 requests):
- 18 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T872_U21594_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T875_U21593_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T874_U21591_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T873_U21590_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T876_U21583_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T877_U21585_M0> (2 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
- 1 requests for handler REQ_HANDLER_SESSION
Requests in queue <T878_U21589_M0> (2 requests):
- 2 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T879_U21603_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T880_U21604_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T881_U21606_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T882_U21608_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T883_U21610_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T885_U21618_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T886_U21621_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T887_U21622_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T888_U21623_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T889_U21625_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T890_U21626_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T891_U21627_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T892_U21629_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T893_U21631_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC
Requests in queue <T894_U21633_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T895_U21634_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T896_U21635_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T897_U21637_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T898_U21638_M0> (9 requests):
- 9 requests for handler REQ_HANDLER_PLUGIN
Requests in queue <T899_U21640_M0> (1 requests):
- 1 requests for handler REQ_HANDLER_RFC

Infos about some special queues:

Queue <DispatcherQueue> in slot 0 (port=18802) has no requests


Queue <GatewayQueue> in slot 1 (port=15190) has no requests
Queue <IcmanQueue> in slot 2 (port=27708) has no requests
Queue <StartServiceQueue> in slot 3 (port=0) has no requests

Workprocess Table (long) Sun Sep 22 05:30:43 2019


------------------------------------------------------------

Server is overloaded
Current snapshot id: 189
DB clean time (in percent of total time) : 24.43 %
Number of preemptions : 88

|No |Pid     |Type|State  |Cause|Err|Prio|Sess-Key        |Sess-Type|Locked|Sem|Time |Program                                 |Cli|User        |Action              |Action-Info         |
|---|--------|----|-------|-----|---|----|----------------|---------|------|---|-----|----------------------------------------|---|------------|--------------------|--------------------|
|  0|        |DIA |WP_KILL|     |13 |norm|T22_U19960_M0   |HTTP_NORM|      |   |     |                                        |001|SM_EXTERN_WS|                    |                    |
|  1|        |DIA |WP_KILL|     |219|norm|T59_U19947_M0   |HTTP_NORM|      |   |     |SAPMHTTP                                |001|SM_EXTERN_WS|                    |                    |
|  2|32121   |DIA |WP_HOLD|RFC  |   |low |T10_U9773_M0    |ASYNC_RFC|      |   |10586|SAPMSSY1                                |001|SM_EFWK     |                    |                    |
|  3|19909   |DIA |WP_RUN |     |   |high|T884_U21639_M0  |INTERNAL |      |   |    3|                                        |   |            |                    |                    |
|  4|19804   |DIA |WP_RUN |     |   |high|T854_U21632_M0  |INTERNAL |      |   |    9|                                        |000|SAPSYS      |                    |                    |
|  5|        |DIA |WP_KILL|     |14 |high|T109_U5012_M1   |GUI      |      |   |     |SBAL_DELETE                             |001|EXT_SCHAITAN|                    |                    |
|  6|        |DIA |WP_KILL|     |14 |    |                |         |      |   |     |                                        |   |            |                    |                    |
|  7|        |DIA |WP_KILL|     |13 |norm|T55_U19953_M0   |HTTP_NORM|      |   |     |                                        |001|SM_EXTERN_WS|                    |                    |

Found 8 active workprocesses


Total number of workprocesses is 16

Session Table Sun Sep 22 05:30:43 2019


------------------------------------------------------------

|Logon-Type  |Sess-Key        |Cli|User        |Terminal            |Time    |WP |Program                                 |Prio|Tasks|Application-Info                                  |Tcode     |ES Mem(KB)|
|------------|----------------|---|------------|--------------------|--------|---|----------------------------------------|----|-----|--------------------------------------------------|----------|----------|
|HTTP_NORMAL |T0_U20103_M0 | | |10.54.36.26 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T1_U20146_M0 | | |10.54.36.37 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T2_U20040_M0 | | |10.54.36.37 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T3_U20015_M0 | | |smprd02.niladv.org |04:32:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T4_U20075_M0 | | |10.54.36.19 |04:34:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T5_U20044_M0 | | |10.54.36.28 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T6_U20079_M0 | | |10.54.36.17 |04:35:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T7_U20048_M0 | | |10.54.36.29 |04:33:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T8_U20017_M0 | | |10.54.36.27 |04:32:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T9_U20138_M0 | | |10.54.36.36 |04:37:03| |
|norm|3 | | |
0|
|ASYNC_RFC |T10_U9773_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|2 |
SAPMSSY1 |low | |
| | 4215|
|HTTP_NORMAL |T11_U20406_M0 | | |10.54.36.29 |04:47:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T12_U20220_M0 | | |10.54.36.32 |04:40:50| |
|norm|16 | | |
0|
|HTTP_NORMAL |T13_U20026_M0 | | |10.54.36.12 |04:33:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T14_U20145_M0 | | |10.54.36.13 |04:37:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T15_U20162_M0 | | |10.54.36.38 |04:38:02| |
|norm|7 | | |
0|
|HTTP_NORMAL |T16_U20130_M0 | | |10.50.47.10 |04:36:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T17_U19983_M0 | | |10.54.36.13 |04:31:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T18_U20020_M0 | | |10.50.47.10 |04:32:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T19_U20278_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T20_U20221_M0 | | |10.54.36.27 |04:40:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T21_U20194_M0 | | |10.54.36.37 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T22_U19960_M0 |001|SM_EXTERN_WS|10.54.36.27 |04:32:00|0 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T23_U20175_M0 | | |10.54.36.29 |04:38:48| |
|norm|16 | | |
0|
|SYNC_RFC |T24_U20019_M0 | | |smprd02.niladv.org |04:32:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T25_U20202_M0 | | |10.54.36.12 |04:40:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T26_U20099_M0 | | |10.54.36.13 |04:35:31| |
|norm|6 | | |
0|
|HTTP_NORMAL |T27_U20043_M0 | | |10.54.36.11 |04:33:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T28_U20082_M0 | | |10.54.36.30 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T29_U20080_M0 | | |10.54.36.25 |04:35:02| |
|norm|3 | | |
0|
|ASYNC_RFC |T30_U25456_M0 |001|EXT_SCHAITAN| |04:30:00|4 |
SAPMSSY1 |low |2 |
| | 4237|
|HTTP_NORMAL |T31_U20168_M0 | | |10.54.36.35 |04:38:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T32_U20104_M0 | | |10.54.36.28 |04:35:33| |
|norm|3 | | |
0|
|SYNC_RFC |T33_U20262_M0 | | |smprd02.niladv.org |04:42:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T34_U20024_M0 | | |10.54.36.19 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T35_U20203_M0 | | |10.54.36.36 |04:40:03| |
|norm|3 | | |
0|
|SYNC_RFC |T36_U20219_M0 | | |smprd02.niladv.org |04:40:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T37_U20132_M0 | | |10.54.36.34 |04:36:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T38_U20216_M0 | | |10.54.36.29 |04:40:38| |
|norm|18 | | |
0|
|HTTP_NORMAL |T39_U20016_M0 | | |10.54.36.32 |04:32:50| |
|norm|7 | | |
0|
|HTTP_NORMAL |T40_U20136_M0 | | |10.54.36.25 |04:37:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T41_U20100_M0 | | |10.54.36.37 |04:35:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T42_U20205_M0 | | |10.50.47.13 |04:40:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T43_U20076_M0 | | |10.54.36.15 |04:35:00| |
|norm|3 | | |
0|
|HTTP_NORMAL |T44_U19965_M0 | | |10.54.36.33 |04:30:57| |
|norm|3 | | |
0|
|SYNC_RFC |T45_U20180_M0 | | |smprd02.niladv.org |04:38:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T46_U20023_M0 | | |10.54.36.34 |04:32:58| |
|norm|3 | | |
0|
|ASYNC_RFC |T47_U9774_M0 |001|SM_EFWK |smprd02.niladv.org |00:06:16|0 |
SAPMSSY1 |low | |
| | 8342|
|HTTP_NORMAL |T48_U20071_M0 | | |10.50.47.10 |04:34:54| |
|norm|4 | | |
0|
|HTTP_NORMAL |T49_U19974_M0 | | |10.54.36.36 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T50_U20135_M0 | | |10.54.36.17 |04:37:02| |
|norm|3 | | |
0|
|SYNC_RFC |T51_U20215_M0 | | |smprd02.niladv.org |04:40:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T52_U19972_M0 | | |10.54.36.25 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T53_U20177_M0 | | |10.54.36.32 |04:38:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T54_U19975_M0 | | |10.54.36.30 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T55_U19953_M0 |001|SM_EXTERN_WS|10.54.36.14 |04:31:18|7 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T56_U20025_M0 | | |10.54.36.15 |04:33:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T57_U19917_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|7 |
SAPMSSY1 |norm|2 |
| | 4246|
|HTTP_NORMAL |T58_U20151_M0 | | |10.54.36.14 |04:37:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T59_U19947_M0 |001|SM_EXTERN_WS|10.54.36.13 |04:30:42|1 |
SAPMHTTP |norm|1 |
| | 4590|
|HTTP_NORMAL |T60_U20028_M0 | | |10.54.36.17 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T61_U20113_M0 | | |10.54.36.32 |04:35:50| |
|norm|17 | | |
0|
|HTTP_NORMAL |T62_U20045_M0 | | |10.54.36.14 |04:33:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T63_U20029_M0 | | |10.54.36.25 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T64_U20148_M0 | | |10.54.36.26 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T65_U19963_M0 | | |10.50.47.10 |04:30:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T66_U20030_M0 | | |10.54.36.38 |04:33:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T67_U19964_M0 | | |10.54.36.40 |04:30:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T68_U19968_M0 | | |10.54.36.15 |04:31:00| |
|norm|5 | | |
0|
|ASYNC_RFC |T69_U19918_M0 |001|BGRFC_SUSR |smprd02.niladv.org |04:29:56|4 |
SAPMSSY1 |norm| |
| | 4203|
|HTTP_NORMAL |T70_U20109_M0 | | |10.54.36.41 |04:35:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T71_U20451_M0 | | |10.54.36.35 |04:49:29| |
|norm|3 | | |
0|
|ASYNC_RFC |T72_U18046_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4247|
|HTTP_NORMAL |T73_U20178_M0 | | |10.54.36.27 |04:38:50| |
|norm|3 | | |
0|
|SYNC_RFC |T74_U19995_M0 | | |smprd02.niladv.org |04:31:48| |
|norm|1 | | |
0|
|SYNC_RFC |T75_U20558_M0 | | |smprd02.niladv.org |04:53:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T76_U20199_M0 | | |10.50.47.10 |04:39:53| |
|norm|4 | | |
0|
|HTTP_NORMAL |T77_U20172_M0 | | |10.54.36.41 |04:38:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T78_U20239_M0 | | |10.54.36.25 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T79_U20212_M0 | | |10.54.36.11 |04:40:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T80_U19969_M0 | | |10.54.36.12 |04:31:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T81_U20034_M0 | | |10.50.47.13 |04:33:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T82_U20013_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T83_U20022_M0 | | |10.54.36.33 |04:32:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T84_U20142_M0 | | |10.50.47.13 |04:37:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T85_U20077_M0 | | |10.54.36.12 |04:35:01| |
|norm|3 | | |
0|
|SYNC_RFC |T86_U20069_M0 | | |smprd02.niladv.org |04:34:48| |
|norm|1 | | |
0|
|SYNC_RFC |T87_U19196_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:05|1 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T88_U20160_M0 | | |10.54.36.19 |04:37:58| |
|norm|29 | | |
0|
|HTTP_NORMAL |T89_U20156_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T90_U20133_M0 | | |10.54.36.15 |04:37:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T91_U20163_M0 | | |10.54.36.30 |04:38:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T92_U20196_M0 | | |10.54.36.14 |04:39:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T93_U20181_M0 | | |10.54.36.34 |04:38:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T94_U20042_M0 | | |10.54.36.26 |04:33:32| |
|norm|3 | | |
0|
|SYNC_RFC |T95_U20155_M0 | | |smprd02.niladv.org |04:37:48| |
|norm|1 | | |
0|
|SYNC_RFC |T96_U20125_M0 | | |smprd02.niladv.org |04:36:37| |
|norm|1 | | |
0|
|SYNC_RFC |T97_U19999_M0 | | |smprd02.niladv.org |04:32:01| |
|norm|1 | | |
0|
|SYNC_RFC |T98_U18282_M0 |001|SMD_RFC |smprd02.niladv.org |04:29:08|4 |
SAPMSSY1 |norm|1 |
| | 4248|
|HTTP_NORMAL |T99_U20032_M0 | | |10.54.36.30 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T100_U20195_M0 | | |10.54.36.26 |04:39:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T101_U20038_M0 | | |10.54.36.35 |04:33:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T102_U20210_M0 | | |10.54.36.13 |04:40:32| |
|norm|3 | | |
0|
|SYNC_RFC |T103_U20248_M0 | | |ascsbwq02.niladv.org|04:41:31| |
|low |1 | | |
0|
|SYNC_RFC |T104_U20107_M0 | | |smprd02.niladv.org |04:35:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T105_U19973_M0 | | |10.54.36.38 |04:31:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T106_U20213_M0 | | |10.54.36.28 |04:40:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T107_U19987_M0 | | |10.54.36.26 |04:31:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T108_U20154_M0 | | |10.54.36.29 |04:37:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T110_U20031_M0 | | |10.54.36.36 |04:33:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T111_U20073_M0 | | |10.54.36.33 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T112_U20329_M0 | | |smprd02.niladv.org |04:44:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T113_U19967_M0 | | |10.54.36.19 |04:30:58| |
|norm|5 | | |
0|
|HTTP_NORMAL |T114_U20014_M0 | | |10.54.36.29 |04:32:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T115_U20182_M0 | | |10.54.36.15 |04:39:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T116_U20039_M0 | | |10.54.36.13 |04:33:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T117_U20074_M0 | | |10.54.36.34 |04:34:58| |
|norm|3 | | |
0|
|SYNC_RFC |T118_U22844_M0 |001|SMDAGENT_SMP|smprd02.niladv.org |04:27:11|7 |
SAPMSSY1 |norm|1 |
| | 4233|
|HTTP_NORMAL |T119_U19971_M0 | | |10.54.36.17 |04:31:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T120_U20137_M0 | | |10.54.36.12 |04:37:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T121_U20226_M0 | | |10.54.36.19 |04:40:58| |
|norm|16 | | |
0|
|SYNC_RFC |T122_U20050_M0 | | |smprd02.niladv.org |04:33:52| |
|norm|1 | | |
0|
|HTTP_NORMAL |T123_U20185_M0 | | |10.54.36.17 |04:39:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T124_U20081_M0 | | |10.54.36.36 |04:35:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T125_U20235_M0 | | |10.54.36.34 |04:41:00| |
|norm|3 | | |
0|
|SYNC_RFC |T126_U18279_M0 |001|SMD_RFC |smprd02.niladv.org |04:27:16|0 |
SAPMSSY1 |norm|1 |
| | 4249|
|HTTP_NORMAL |T127_U19966_M0 | | |10.54.36.34 |04:30:57| |
|norm|6 | | |
0|
|SYNC_RFC |T128_U20140_M0 | | |smprd02.niladv.org |04:37:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T129_U20184_M0 | | |10.54.36.25 |04:39:03| |
|norm|3 | | |
0|
|SYNC_RFC |T130_U20267_M0 | | |smprd02.niladv.org |04:42:11| |
|norm|1 | | |
0|
|SYNC_RFC |T131_U20333_M0 | | |smprd02.niladv.org |04:45:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T132_U20225_M0 | | |10.54.36.33 |04:40:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T133_U20159_M0 | | |10.54.36.33 |04:37:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T134_U20046_M0 | | |10.54.36.41 |04:33:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T135_U20255_M0 | | |10.54.36.41 |04:41:39| |
|norm|3 | | |
0|
|SYNC_RFC |T136_U20158_M0 | | |smprd02.niladv.org |04:37:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T137_U20236_M0 | | |10.54.36.15 |04:41:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T138_U20086_M0 | | |10.50.47.13 |04:35:12| |
|norm|4 | | |
0|
|SYNC_RFC |T139_U20112_M0 | | |smprd02.niladv.org |04:35:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T140_U20098_M0 | | |10.54.36.35 |04:35:28| |
|norm|3 | | |
0|
|SYNC_RFC |T141_U20176_M0 | | |smprd02.niladv.org |04:38:49| |
|norm|1 | | |
0|
|SYNC_RFC |T142_U20002_M0 | | |smprd02.niladv.org |04:32:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T143_U20250_M0 | | |10.54.36.37 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T144_U20241_M0 | | |10.54.36.17 |04:41:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T145_U20282_M0 | | |10.54.36.33 |04:42:58| |
|norm|3 | | |
0|
|SYNC_RFC |T146_U20253_M0 | | |smprd02.niladv.org |04:41:37| |
|norm|1 | | |
0|
|ASYNC_RFC |T147_U18048_M0 |001|SM_EFWK |smprd02.niladv.org |23:27:18|5 |
SAPMSSY1 |low | |
| | 4246|
|HTTP_NORMAL |T148_U20238_M0 | | |10.54.36.30 |04:41:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T149_U20149_M0 | | |10.54.36.11 |04:37:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T150_U20251_M0 | | |10.54.36.26 |04:41:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T151_U20265_M0 | | |10.54.36.36 |04:42:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T152_U20240_M0 | | |10.54.36.38 |04:41:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T153_U19985_M0 | | |10.54.36.37 |04:31:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T154_U20261_M0 | | |10.50.47.10 |04:41:54| |
|norm|3 | | |
0|
|SYNC_RFC |T155_U20258_M0 | | |smprd02.niladv.org |04:41:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T156_U20224_M0 | | |10.54.36.40 |04:40:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T157_U20264_M0 | | |10.54.36.12 |04:42:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T158_U20273_M0 | | |10.54.36.13 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T159_U20259_M0 | | |10.54.36.29 |04:41:49| |
|norm|7 | | |
0|
|HTTP_NORMAL |T160_U20274_M0 | | |10.54.36.11 |04:42:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T161_U20349_M0 | | |10.54.36.27 |04:45:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T162_U20279_M0 | | |10.54.36.29 |04:42:48| |
|norm|3 | | |
0|
|SYNC_RFC |T163_U20281_M0 | | |smprd02.niladv.org |04:42:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T164_U20283_M0 | | |10.54.36.19 |04:42:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T165_U20284_M0 | | |10.54.36.34 |04:43:00| |
|norm|3 | | |
0|
|SYNC_RFC |T166_U20285_M0 | | |smprd02.niladv.org |04:43:01| |
|norm|1 | | |
0|
|SYNC_RFC |T167_U20303_M0 | | |smprd02.niladv.org |04:43:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T168_U20288_M0 | | |10.54.36.30 |04:43:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T169_U20289_M0 | | |10.54.36.38 |04:43:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T170_U20291_M0 | | |10.50.47.13 |04:43:11| |
|norm|4 | | |
0|
|HTTP_NORMAL |T171_U20295_M0 | | |10.54.36.35 |04:43:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T172_U20298_M0 | | |10.54.36.37 |04:43:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T173_U20299_M0 | | |10.54.36.14 |04:43:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T174_U20300_M0 | | |10.54.36.41 |04:43:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T175_U20304_M0 | | |10.54.36.27 |04:43:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T176_U20305_M0 | | |10.54.36.29 |04:43:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T177_U20306_M0 | | |10.54.36.32 |04:43:51| |
|norm|4 | | |
0|
|SYNC_RFC |T178_U20308_M0 | | |smprd02.niladv.org |04:43:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T179_U20309_M0 | | |10.50.47.10 |04:43:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T180_U20310_M0 | | |10.54.36.15 |04:44:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T181_U20313_M0 | | |10.54.36.12 |04:44:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T182_U20314_M0 | | |10.54.36.25 |04:44:03| |
|norm|3 | | |
0|
|SYNC_RFC |T183_U20315_M0 | | |smprd02.niladv.org |04:44:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T184_U20316_M0 | | |10.54.36.36 |04:44:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T185_U20317_M0 | | |10.54.36.17 |04:44:05| |
|norm|3 | | |
0|
|SYNC_RFC |T186_U20320_M0 | | |smprd02.niladv.org |04:44:14| |
|norm|1 | | |
0|
|HTTP_NORMAL |T187_U20325_M0 | | |10.54.36.11 |04:44:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T188_U20326_M0 | | |10.54.36.26 |04:44:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T189_U20331_M0 | | |10.54.36.33 |04:44:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T190_U20350_M0 | | |10.54.36.32 |04:45:51| |
|norm|22 | | |
0|
|HTTP_NORMAL |T191_U20619_M0 | | |10.54.36.29 |04:56:48| |
|norm|3 | | |
0|
|SYNC_RFC |T192_U20338_M0 | | |smprd02.niladv.org |04:45:15| |
|norm|1 | | |
0|
|HTTP_NORMAL |T193_U20341_M0 | | |10.54.36.35 |04:45:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T194_U20343_M0 | | |10.54.36.13 |04:45:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T195_U20345_M0 | | |10.54.36.28 |04:45:33| |
|norm|4 | | |
0|
|SYNC_RFC |T196_U20346_M0 | | |smprd02.niladv.org |04:45:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T197_U20347_M0 | | |10.54.36.29 |04:45:38| |
|norm|17 | | |
0|
|HTTP_NORMAL |T198_U20411_M0 | | |10.54.36.34 |04:47:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T199_U20353_M0 | | |10.54.36.34 |04:45:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T200_U20354_M0 | | |10.54.36.19 |04:45:58| |
|norm|22 | | |
0|
|HTTP_NORMAL |T201_U20355_M0 | | |10.54.36.15 |04:46:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T202_U20357_M0 | | |10.54.36.38 |04:46:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T203_U20358_M0 | | |10.54.36.30 |04:46:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T204_U20359_M0 | | |10.54.36.12 |04:46:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T205_U20360_M0 | | |10.54.36.25 |04:46:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T206_U20362_M0 | | |10.54.36.36 |04:46:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T207_U20363_M0 | | |10.54.36.17 |04:46:05| |
|norm|4 | | |
0|
|HTTP_NORMAL |T208_U20991_M0 | | |10.54.36.11 |05:09:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T209_U20370_M0 | | |10.54.36.11 |04:46:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T210_U20438_M0 | | |10.54.36.15 |04:49:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T211_U20403_M0 | | |10.54.36.28 |04:47:34| |
|norm|6 | | |
0|
|HTTP_NORMAL |T212_U20408_M0 | | |10.54.36.29 |04:47:49| |
|norm|3 | | |
0|
|HTTP_NORMAL |T213_U20687_M0 | | |10.54.36.11 |04:59:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T214_U20638_M0 | | |10.54.36.29 |04:57:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T215_U20400_M0 | | |10.54.36.35 |04:47:29| |
|norm|3 | | |
0|
|HTTP_NORMAL |T216_U20391_M0 | | |10.54.36.33 |04:46:59| |
|norm|3 | | |
0|
|SYNC_RFC |T217_U20407_M0 | | |smprd02.niladv.org |04:47:49| |
|norm|1 | | |
0|
|SYNC_RFC |T218_U20394_M0 | | |smprd02.niladv.org |04:47:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T219_U20381_M0 | | |10.54.36.14 |04:46:35| |
|norm|4 | | |
0|
|SYNC_RFC |T220_U20382_M0 | | |smprd02.niladv.org |04:46:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T221_U20383_M0 | | |10.54.36.41 |04:46:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T222_U20386_M0 | | |10.54.36.29 |04:46:48| |
|norm|6 | | |
0|
|SYNC_RFC |T223_U20387_M0 | | |smprd02.niladv.org |04:46:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T224_U20389_M0 | | |10.50.47.10 |04:46:53| |
|norm|4 | | |
0|
|SYNC_RFC |T225_U20410_M0 | | |smprd02.niladv.org |04:47:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T226_U20412_M0 | | |10.54.36.19 |04:47:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T227_U20432_M0 | | |10.54.36.32 |04:48:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T228_U20415_M0 | | |10.54.36.12 |04:48:03| |
|norm|6 | | |
0|
|HTTP_NORMAL |T229_U20416_M0 | | |10.54.36.38 |04:48:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T230_U20417_M0 | | |10.54.36.30 |04:48:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T231_U20419_M0 | | |10.54.36.17 |04:48:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T232_U20424_M0 | | |10.54.36.37 |04:48:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T233_U20426_M0 | | |10.54.36.11 |04:48:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T234_U20427_M0 | | |10.54.36.26 |04:48:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T235_U20428_M0 | | |10.54.36.14 |04:48:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T236_U20429_M0 | | |10.54.36.41 |04:48:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T237_U20433_M0 | | |10.54.36.27 |04:48:50| |
|norm|3 | | |
0|
|SYNC_RFC |T238_U20435_M0 | | |smprd02.niladv.org |04:48:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T239_U20436_M0 | | |10.50.47.10 |04:48:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T240_U20455_M0 | | |10.54.36.29 |04:49:48| |
|norm|5 | | |
0|
|HTTP_NORMAL |T241_U20440_M0 | | |10.54.36.25 |04:49:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T242_U20441_M0 | | |10.54.36.36 |04:49:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T243_U20466_M0 | | |10.54.36.37 |04:50:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T244_U20464_M0 | | |10.54.36.13 |04:50:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T245_U20504_M0 | | |10.54.36.33 |04:51:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T246_U20530_M0 | | |10.54.36.29 |04:52:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T247_U20447_M0 | | |10.50.47.13 |04:49:12| |
|norm|4 | | |
0|
|SYNC_RFC |T248_U20456_M0 | | |smprd02.niladv.org |04:49:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T249_U20458_M0 | | |10.54.36.33 |04:49:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T250_U20467_M0 | | |10.54.36.28 |04:50:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T251_U20559_M0 | | |10.54.36.32 |04:53:50| |
|norm|21 | | |
0|
|SYNC_RFC |T252_U20469_M0 | | |smprd02.niladv.org |04:50:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T253_U20470_M0 | | |10.54.36.29 |04:50:38| |
|norm|16 | | |
0|
|SYNC_RFC |T254_U20474_M0 | | |smprd02.niladv.org |04:50:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T255_U20475_M0 | | |10.54.36.32 |04:50:50| |
|norm|21 | | |
0|
|SYNC_RFC |T256_U20748_M0 | | |smprd02.niladv.org |05:01:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T257_U20478_M0 | | |10.50.47.10 |04:50:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T258_U20479_M0 | | |10.54.36.40 |04:50:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T259_U20480_M0 | | |10.54.36.34 |04:50:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T260_U20481_M0 | | |10.54.36.19 |04:50:58| |
|norm|21 | | |
0|
|HTTP_NORMAL |T261_U20483_M0 | | |10.54.36.15 |04:51:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T262_U20484_M0 | | |10.54.36.12 |04:51:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T263_U20485_M0 | | |10.54.36.17 |04:51:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T264_U20486_M0 | | |10.54.36.38 |04:51:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T265_U20488_M0 | | |10.54.36.25 |04:51:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T266_U20489_M0 | | |10.54.36.30 |04:51:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T267_U20801_M0 | | |10.54.36.35 |05:03:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T268_U20492_M0 | | |10.50.47.13 |04:51:12| |
|norm|3 | | |
0|
|HTTP_NORMAL |T269_U20497_M0 | | |10.54.36.26 |04:51:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T270_U20802_M0 | | |10.54.36.13 |05:03:30| |
|norm|4 | | |
0|
|SYNC_RFC |T271_U20500_M0 | | |smprd02.niladv.org |04:51:38| |
|norm|1 | | |
0|
|HTTP_NORMAL |T272_U20501_M0 | | |10.54.36.41 |04:51:39| |
|norm|3 | | |
0|
|SYNC_RFC |T273_U20507_M0 | | |smprd02.niladv.org |04:52:01| |
|norm|1 | | |
0|
|SYNC_RFC |T274_U20509_M0 | | |smprd02.niladv.org |04:52:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T275_U20514_M0 | | |10.54.36.35 |04:52:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T276_U20537_M0 | | |10.54.36.34 |04:52:58| |
|norm|3 | | |
0|
|SYNC_RFC |T277_U20600_M0 | | |smprd02.niladv.org |04:55:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T278_U20536_M0 | | |10.50.47.10 |04:52:53| |
|norm|3 | | |
0|
|SYNC_RFC |T279_U20542_M0 | | |smprd02.niladv.org |04:53:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T280_U20543_M0 | | |10.54.36.12 |04:53:01| |
|norm|6 | | |
0|
|SYNC_RFC |T281_U20535_M0 | | |smprd02.niladv.org |04:52:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T282_U20544_M0 | | |10.54.36.17 |04:53:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T283_U20541_M0 | | |10.54.36.15 |04:53:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T284_U20523_M0 | | |10.54.36.13 |04:52:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T285_U20525_M0 | | |10.54.36.37 |04:52:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T286_U20526_M0 | | |10.54.36.28 |04:52:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T287_U20582_M0 | | |10.54.36.12 |04:55:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T288_U20578_M0 | | |10.54.36.29 |04:54:48| |
|norm|14 | | |
0|
|HTTP_NORMAL |T289_U20532_M0 | | |10.54.36.29 |04:52:48| |
|norm|3 | | |
0|
|SYNC_RFC |T290_U20533_M0 | | |smprd02.niladv.org |04:52:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T291_U20546_M0 | | |10.54.36.38 |04:53:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T292_U20547_M0 | | |10.54.36.30 |04:53:05| |
|norm|4 | | |
0|
|SYNC_RFC |T293_U20809_M0 | | |smprd02.niladv.org |05:03:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T294_U20553_M0 | | |10.54.36.26 |04:53:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T295_U20554_M0 | | |10.54.36.11 |04:53:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T296_U20636_M0 | | |10.54.36.14 |04:57:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T297_U20560_M0 | | |10.54.36.27 |04:53:50| |
|norm|5 | | |
0|
|SYNC_RFC |T298_U20562_M0 | | |smprd02.niladv.org |04:53:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T299_U20563_M0 | | |10.54.36.33 |04:53:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T300_U20907_M0 | | |10.50.47.10 |05:06:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T301_U20567_M0 | | |10.54.36.25 |04:54:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T302_U20568_M0 | | |10.54.36.36 |04:54:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T303_U20574_M0 | | |10.54.36.37 |04:54:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T304_U20575_M0 | | |10.54.36.14 |04:54:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T305_U20580_M0 | | |10.50.47.10 |04:54:54| |
|norm|3 | | |
0|
|HTTP_NORMAL |T306_U20581_M0 | | |10.54.36.34 |04:54:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T307_U20585_M0 | | |10.54.36.17 |04:55:05| |
|norm|3 | | |
0|
|ASYNC_RFC |T308_U20586_M0 | | |10.54.36.10 |04:55:10| |
|norm|1 | | |
0|
|HTTP_NORMAL |T309_U20587_M0 | | |10.50.47.13 |04:55:12| |
|norm|4 | | |
0|
|HTTP_NORMAL |T310_U20592_M0 | | |10.54.36.13 |04:55:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T311_U20594_M0 | | |10.54.36.28 |04:55:33| |
|norm|4 | | |
0|
|HTTP_NORMAL |T312_U20595_M0 | | |10.54.36.26 |04:55:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T313_U20596_M0 | | |10.54.36.29 |04:55:38| |
|norm|16 | | |
0|
|SYNC_RFC |T314_U20597_M0 | | |smprd02.niladv.org |04:55:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T315_U20602_M0 | | |10.54.36.19 |04:55:58| |
|norm|20 | | |
0|
|HTTP_NORMAL |T316_U20603_M0 | | |10.54.36.15 |04:56:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T317_U20605_M0 | | |10.54.36.30 |04:56:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T318_U20606_M0 | | |10.54.36.38 |04:56:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T319_U20613_M0 | | |10.54.36.11 |04:56:32| |
|norm|3 | | |
0|
|SYNC_RFC |T320_U20615_M0 | | |smprd02.niladv.org |04:56:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T321_U20616_M0 | | |10.54.36.41 |04:56:40| |
|norm|3 | | |
0|
|SYNC_RFC |T322_U20620_M0 | | |smprd02.niladv.org |04:56:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T323_U20621_M0 | | |10.54.36.32 |04:56:50| |
|norm|13 | | |
0|
|HTTP_NORMAL |T324_U20622_M0 | | |10.54.36.27 |04:56:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T325_U20624_M0 | | |10.54.36.33 |04:56:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T326_U20626_M0 | | |10.54.36.36 |04:57:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T327_U20627_M0 | | |10.54.36.25 |04:57:03| |
|norm|3 | | |
0|
|SYNC_RFC |T328_U20630_M0 | | |smprd02.niladv.org |04:57:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T329_U20632_M0 | | |10.50.47.13 |04:57:13| |
|norm|4 | | |
0|
|HTTP_NORMAL |T330_U20639_M0 | | |10.54.36.29 |04:57:48| |
|norm|3 | | |
0|
|SYNC_RFC |T331_U20641_M0 | | |smprd02.niladv.org |04:57:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T332_U20642_M0 | | |10.50.47.10 |04:57:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T333_U20643_M0 | | |10.54.36.34 |04:57:58| |
|norm|3 | | |
0|
|SYNC_RFC |T334_U20692_M0 | | |smprd02.niladv.org |04:59:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T335_U20645_M0 | | |10.54.36.12 |04:58:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T336_U20648_M0 | | |10.54.36.17 |04:58:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T337_U20653_M0 | | |10.54.36.35 |04:58:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T338_U20717_M0 | | |10.54.36.32 |05:00:50| |
|norm|3 | | |
0|
|SYNC_RFC |T339_U20673_M0 | | |smprd02.niladv.org |04:58:53| |
|norm|1 | | |
0|
|HTTP_NORMAL |T340_U20677_M0 | | |10.54.36.38 |04:59:02| |
|norm|4 | | |
0|
|SYNC_RFC |T341_U20690_M0 | | |smprd02.niladv.org |04:59:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T342_U20674_M0 | | |10.54.36.19 |04:58:58| |
|norm|13 | | |
0|
|HTTP_NORMAL |T343_U20691_M0 | | |10.54.36.29 |04:59:49| |
|norm|9 | | |
0|
|HTTP_NORMAL |T344_U20676_M0 | | |10.54.36.15 |04:59:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T345_U20679_M0 | | |10.54.36.30 |04:59:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T346_U20990_M0 | | |10.54.36.13 |05:09:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T347_U20664_M0 | | |10.54.36.26 |04:58:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T348_U20723_M0 | | |10.54.36.12 |05:01:01| |
|norm|4 | | |
0|
|SYNC_RFC |T349_U20669_M0 | | |smprd02.niladv.org |04:58:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T350_U20670_M0 | | |10.54.36.32 |04:58:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T351_U20671_M0 | | |10.54.36.27 |04:58:50| |
|norm|4 | | |
0|
|SYNC_RFC |T352_U20716_M0 | | |smprd02.niladv.org |05:00:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T353_U20694_M0 | | |10.54.36.33 |04:59:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T354_U20697_M0 | | |10.54.36.25 |05:00:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T355_U20698_M0 | | |10.54.36.36 |05:00:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T356_U20700_M0 | | |10.50.47.13 |05:00:12| |
|norm|3 | | |
0|
|SYNC_RFC |T357_U20702_M0 | | |smprd02.niladv.org |05:00:14| |
|norm|1 | | |
0|
|SYNC_RFC |T358_U20703_M0 | | |smprd02.niladv.org |05:00:16| |
|norm|1 | | |
0|
|SYNC_RFC |T359_U20704_M0 | | |smprd02.niladv.org |05:00:18| |
|norm|1 | | |
0|
|SYNC_RFC |T360_U20705_M0 | | |smprd02.niladv.org |05:00:20| |
|norm|1 | | |
0|
|SYNC_RFC |T361_U20707_M0 | | |smprd02.niladv.org |05:00:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T362_U20711_M0 | | |10.54.36.37 |05:00:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T363_U20712_M0 | | |10.54.36.14 |05:00:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T364_U20713_M0 | | |10.54.36.29 |05:00:38| |
|norm|15 | | |
0|
|SYNC_RFC |T365_U20714_M0 | | |smprd02.niladv.org |05:00:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T366_U20718_M0 | | |10.54.36.27 |05:00:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T367_U20720_M0 | | |10.50.47.10 |05:00:53| |
|norm|3 | | |
0|
|HTTP_NORMAL |T368_U20721_M0 | | |10.54.36.40 |05:00:56| |
|norm|3 | | |
0|
|HTTP_NORMAL |T369_U20722_M0 | | |10.54.36.34 |05:00:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T370_U20724_M0 | | |10.54.36.15 |05:01:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T371_U20725_M0 | | |10.54.36.17 |05:01:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T372_U20727_M0 | | |10.54.36.38 |05:01:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T373_U20738_M0 | | |10.54.36.35 |05:01:27| |
|norm|3 | | |
0|
|HTTP_NORMAL |T374_U20729_M0 | | |10.54.36.30 |05:01:04| |
|norm|3 | | |
0|
|SYNC_RFC |T375_U20731_M0 | | |smprd02.niladv.org |05:01:14| |
|norm|1 | | |
0|
|SYNC_RFC |T376_U20732_M0 | | |smprd02.niladv.org |05:01:16| |
|norm|1 | | |
0|
|SYNC_RFC |T377_U20733_M0 | | |smprd02.niladv.org |05:01:18| |
|norm|1 | | |
0|
|SYNC_RFC |T378_U20734_M0 | | |smprd02.niladv.org |05:01:20| |
|norm|1 | | |
0|
|SYNC_RFC |T379_U20736_M0 | | |smprd02.niladv.org |05:01:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T380_U20739_M0 | | |10.54.36.13 |05:01:30| |
|norm|3 | | |
0|
|HTTP_NORMAL |T381_U20740_M0 | | |10.54.36.26 |05:01:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T382_U20741_M0 | | |10.54.36.28 |05:01:32| |
|norm|6 | | |
0|
|HTTP_NORMAL |T383_U20742_M0 | | |10.54.36.11 |05:01:32| |
|norm|3 | | |
0|
|SYNC_RFC |T384_U20876_M0 | | |smprd02.niladv.org |05:05:50| |
|norm|1 | | |
0|
|SYNC_RFC |T385_U20745_M0 | | |smprd02.niladv.org |05:01:39| |
|norm|1 | | |
0|
|HTTP_NORMAL |T386_U20750_M0 | | |10.54.36.19 |05:01:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T387_U20751_M0 | | |10.54.36.33 |05:01:58| |
|norm|3 | | |
0|
|SYNC_RFC |T388_U20752_M0 | | |smprd02.niladv.org |05:02:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T389_U20754_M0 | | |10.54.36.25 |05:02:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T390_U20755_M0 | | |10.54.36.36 |05:02:04| |
|norm|3 | | |
0|
|SYNC_RFC |T391_U20758_M0 | | |smprd02.niladv.org |05:02:11| |
|norm|1 | | |
0|
|SYNC_RFC |T392_U20760_M0 | | |smprd02.niladv.org |05:02:14| |
|norm|1 | | |
0|
|SYNC_RFC |T393_U20761_M0 | | |smprd02.niladv.org |05:02:16| |
|norm|1 | | |
0|
|SYNC_RFC |T394_U20762_M0 | | |smprd02.niladv.org |05:02:18| |
|norm|1 | | |
0|
|SYNC_RFC |T395_U20763_M0 | | |smprd02.niladv.org |05:02:20| |
|norm|1 | | |
0|
|SYNC_RFC |T396_U20764_M0 | | |smprd02.niladv.org |05:02:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T397_U20769_M0 | | |10.54.36.37 |05:02:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T398_U20877_M0 | | |10.54.36.27 |05:05:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T399_U20774_M0 | | |10.54.36.29 |05:02:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T400_U20775_M0 | | |10.54.36.29 |05:02:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T401_U20776_M0 | | |10.54.36.29 |05:02:48| |
|norm|3 | | |
0|
|SYNC_RFC |T402_U20777_M0 | | |smprd02.niladv.org |05:02:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T403_U20778_M0 | | |10.54.36.27 |05:02:50| |
|norm|5 | | |
0|
|HTTP_NORMAL |T404_U20779_M0 | | |10.54.36.32 |05:02:50| |
|norm|15 | | |
0|
|HTTP_NORMAL |T405_U20781_M0 | | |10.50.47.10 |05:02:53| |
|norm|3 | | |
0|
|SYNC_RFC |T406_U20782_M0 | | |smprd02.niladv.org |05:02:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T407_U20783_M0 | | |10.54.36.34 |05:02:58| |
|norm|3 | | |
0|
|SYNC_RFC |T408_U20784_M0 | | |smprd02.niladv.org |05:03:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T409_U20785_M0 | | |10.54.36.12 |05:03:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T410_U20786_M0 | | |10.54.36.17 |05:03:02| |
|norm|3 | | |
0|
|SYNC_RFC |T411_U21254_M0 | | |smprd02.niladv.org |05:17:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T412_U20789_M0 | | |10.54.36.38 |05:03:04| |
|norm|6 | | |
0|
|HTTP_NORMAL |T413_U20790_M0 | | |10.54.36.30 |05:03:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T414_U20793_M0 | | |10.50.47.13 |05:03:11| |
|norm|5 | | |
0|
|SYNC_RFC |T415_U20795_M0 | | |smprd02.niladv.org |05:03:14| |
|norm|1 | | |
0|
|SYNC_RFC |T416_U20796_M0 | | |smprd02.niladv.org |05:03:16| |
|norm|1 | | |
0|
|SYNC_RFC |T417_U20797_M0 | | |smprd02.niladv.org |05:03:18| |
|norm|1 | | |
0|
|SYNC_RFC |T418_U20798_M0 | | |smprd02.niladv.org |05:03:20| |
|norm|1 | | |
0|
|SYNC_RFC |T419_U20799_M0 | | |smprd02.niladv.org |05:03:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T420_U20803_M0 | | |10.54.36.26 |05:03:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T421_U20804_M0 | | |10.54.36.28 |05:03:32| |
|norm|3 | | |
0|
|SYNC_RFC |T422_U21554_M0 | | |smprd02.niladv.org |05:27:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T423_U20807_M0 | | |10.54.36.41 |05:03:39| |
|norm|3 | | |
0|
|SYNC_RFC |T424_U20810_M0 | | |smprd02.niladv.org |05:03:50| |
|norm|1 | | |
0|
|SYNC_RFC |T425_U20812_M0 | | |smprd02.niladv.org |05:03:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T426_U20813_M0 | | |10.54.36.19 |05:03:58| |
|norm|12 | | |
0|
|HTTP_NORMAL |T427_U20814_M0 | | |10.54.36.33 |05:03:58| |
|norm|3 | | |
0|
|SYNC_RFC |T428_U20844_M0 | | |smprd02.niladv.org |05:04:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T429_U20817_M0 | | |10.54.36.25 |05:04:03| |
|norm|4 | | |
0|
|SYNC_RFC |T430_U20818_M0 | | |smprd02.niladv.org |05:04:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T431_U20819_M0 | | |10.54.36.36 |05:04:04| |
|norm|4 | | |
0|
|SYNC_RFC |T432_U20821_M0 | | |smprd02.niladv.org |05:04:10| |
|norm|1 | | |
0|
|SYNC_RFC |T433_U20823_M0 | | |smprd02.niladv.org |05:04:14| |
|norm|1 | | |
0|
|SYNC_RFC |T434_U20824_M0 | | |smprd02.niladv.org |05:04:16| |
|norm|1 | | |
0|
|SYNC_RFC |T435_U20825_M0 | | |smprd02.niladv.org |05:04:18| |
|norm|1 | | |
0|
|SYNC_RFC |T436_U20827_M0 | | |smprd02.niladv.org |05:04:20| |
|norm|1 | | |
0|
|SYNC_RFC |T437_U20828_M0 | | |smprd02.niladv.org |05:04:22| |
|norm|1 | | |
0|
|SYNC_RFC |T438_U20961_M0 | | |smprd02.niladv.org |05:08:46| |
|norm|1 | | |
0|
|SYNC_RFC |T439_U20856_M0 | | |smprd02.niladv.org |05:05:10| |
|norm|1 | | |
0|
|SYNC_RFC |T440_U20854_M0 | | |smprd02.niladv.org |05:05:04| |
|norm|1 | | |
0|
|HTTP_NORMAL |T441_U20847_M0 | | |10.50.47.10 |05:04:54| |
|norm|4 | | |
0|
|HTTP_NORMAL |T442_U20851_M0 | | |10.54.36.17 |05:05:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T443_U20850_M0 | | |10.54.36.12 |05:05:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T444_U20853_M0 | | |10.54.36.15 |05:05:03| |
|norm|5 | | |
0|
|SYNC_RFC |T445_U20936_M0 | | |smprd02.niladv.org |05:07:45| |
|norm|1 | | |
0|
|HTTP_NORMAL |T446_U20840_M0 | | |10.54.36.37 |05:04:33| |
|norm|5 | | |
0|
|HTTP_NORMAL |T447_U20910_M0 | | |10.54.36.12 |05:07:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T448_U20845_M0 | | |10.54.36.29 |05:04:49| |
|norm|3 | | |
0|
|SYNC_RFC |T449_U20859_M0 | | |smprd02.niladv.org |05:05:14| |
|norm|1 | | |
0|
|SYNC_RFC |T450_U20860_M0 | | |smprd02.niladv.org |05:05:16| |
|norm|1 | | |
0|
|SYNC_RFC |T451_U20861_M0 | | |smprd02.niladv.org |05:05:18| |
|norm|1 | | |
0|
|SYNC_RFC |T452_U20863_M0 | | |smprd02.niladv.org |05:05:20| |
|norm|1 | | |
0|
|SYNC_RFC |T453_U20864_M0 | | |smprd02.niladv.org |05:05:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T454_U20867_M0 | | |10.54.36.35 |05:05:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T455_U20868_M0 | | |10.54.36.13 |05:05:31| |
|norm|6 | | |
0|
|HTTP_NORMAL |T456_U20869_M0 | | |10.54.36.26 |05:05:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T457_U20870_M0 | | |10.54.36.28 |05:05:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T458_U20872_M0 | | |10.54.36.29 |05:05:38| |
|norm|14 | | |
0|
|HTTP_NORMAL |T459_U20873_M0 | | |10.54.36.41 |05:05:40| |
|norm|3 | | |
0|
|SYNC_RFC |T460_U20874_M0 | | |smprd02.niladv.org |05:05:40| |
|norm|1 | | |
0|
|HTTP_NORMAL |T461_U20878_M0 | | |10.54.36.32 |05:05:51| |
|norm|21 | | |
0|
|HTTP_NORMAL |T462_U20880_M0 | | |10.54.36.19 |05:05:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T463_U20881_M0 | | |10.54.36.33 |05:05:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T464_U20911_M0 | | |10.54.36.17 |05:07:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T465_U20884_M0 | | |10.54.36.30 |05:06:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T466_U20886_M0 | | |10.54.36.38 |05:06:03| |
|norm|6 | | |
0|
|SYNC_RFC |T467_U20887_M0 | | |smprd02.niladv.org |05:06:05| |
|norm|1 | | |
0|
|SYNC_RFC |T468_U20888_M0 | | |smprd02.niladv.org |05:06:05| |
|norm|1 | | |
0|
|SYNC_RFC |T469_U20890_M0 | | |smprd02.niladv.org |05:06:14| |
|norm|1 | | |
0|
|SYNC_RFC |T470_U20891_M0 | | |smprd02.niladv.org |05:06:15| |
|norm|1 | | |
0|
|SYNC_RFC |T471_U20892_M0 | | |smprd02.niladv.org |05:06:17| |
|norm|1 | | |
0|
|SYNC_RFC |T472_U20893_M0 | | |smprd02.niladv.org |05:06:18| |
|norm|1 | | |
0|
|SYNC_RFC |T473_U20896_M0 | | |smprd02.niladv.org |05:06:20| |
|norm|1 | | |
0|
|SYNC_RFC |T474_U20897_M0 | | |smprd02.niladv.org |05:06:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T475_U20899_M0 | | |10.54.36.11 |05:06:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T476_U20901_M0 | | |10.54.36.37 |05:06:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T477_U20902_M0 | | |10.54.36.14 |05:06:37| |
|norm|3 | | |
0|
|SYNC_RFC |T478_U20903_M0 | | |smprd02.niladv.org |05:06:40| |
|norm|1 | | |
0|
|SYNC_RFC |T479_U20908_M0 | | |smprd02.niladv.org |05:06:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T480_U20909_M0 | | |10.54.36.34 |05:06:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T481_U20912_M0 | | |10.54.36.25 |05:07:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T482_U20914_M0 | | |10.54.36.36 |05:07:04| |
|norm|4 | | |
0|
|HTTP_NORMAL |T483_U20929_M0 | | |10.54.36.35 |05:07:29| |
|norm|3 | | |
0|
|SYNC_RFC |T484_U20916_M0 | | |smprd02.niladv.org |05:07:05| |
|norm|1 | | |
0|
|SYNC_RFC |T485_U20917_M0 | | |smprd02.niladv.org |05:07:05| |
|norm|1 | | |
0|
|SYNC_RFC |T486_U20918_M0 | | |smprd02.niladv.org |05:07:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T487_U20919_M0 | | |10.50.47.13 |05:07:12| |
|norm|3 | | |
0|
|SYNC_RFC |T488_U20921_M0 | | |smprd02.niladv.org |05:07:15| |
|norm|1 | | |
0|
|SYNC_RFC |T489_U20922_M0 | | |smprd02.niladv.org |05:07:15| |
|norm|1 | | |
0|
|SYNC_RFC |T490_U20923_M0 | | |smprd02.niladv.org |05:07:17| |
|norm|1 | | |
0|
|SYNC_RFC |T491_U20924_M0 | | |smprd02.niladv.org |05:07:19| |
|norm|1 | | |
0|
|SYNC_RFC |T492_U20926_M0 | | |smprd02.niladv.org |05:07:21| |
|norm|1 | | |
0|
|SYNC_RFC |T493_U20927_M0 | | |smprd02.niladv.org |05:07:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T494_U20930_M0 | | |10.54.36.13 |05:07:31| |
|norm|4 | | |
0|
|HTTP_NORMAL |T495_U20931_M0 | | |10.54.36.26 |05:07:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T496_U20933_M0 | | |10.54.36.28 |05:07:34| |
|norm|4 | | |
0|
|HTTP_NORMAL |T497_U20937_M0 | | |10.54.36.29 |05:07:48| |
|norm|12 | | |
0|
|HTTP_NORMAL |T498_U20938_M0 | | |10.54.36.29 |05:07:48| |
|norm|3 | | |
0|
|HTTP_NORMAL |T499_U20939_M0 | | |10.54.36.29 |05:07:49| |
|norm|3 | | |
0|
|SYNC_RFC |T500_U20941_M0 | | |smprd02.niladv.org |05:07:59| |
|norm|1 | | |
0|
|HTTP_NORMAL |T501_U20942_M0 | | |10.54.36.15 |05:08:01| |
|norm|5 | | |
0|
|SYNC_RFC |T502_U20946_M0 | | |smprd02.niladv.org |05:08:06| |
|norm|1 | | |
0|
|SYNC_RFC |T503_U20947_M0 | | |smprd02.niladv.org |05:08:06| |
|norm|1 | | |
0|
|SYNC_RFC |T504_U20949_M0 | | |smprd02.niladv.org |05:08:15| |
|norm|1 | | |
0|
|SYNC_RFC |T505_U20950_M0 | | |smprd02.niladv.org |05:08:17| |
|norm|1 | | |
0|
|SYNC_RFC |T506_U20951_M0 | | |smprd02.niladv.org |05:08:19| |
|norm|1 | | |
0|
|SYNC_RFC |T507_U20952_M0 | | |smprd02.niladv.org |05:08:21| |
|norm|1 | | |
0|
|SYNC_RFC |T508_U20953_M0 | | |smprd02.niladv.org |05:08:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T509_U20958_M0 | | |10.54.36.41 |05:08:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T510_U20962_M0 | | |10.54.36.32 |05:08:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T511_U20963_M0 | | |10.54.36.27 |05:08:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T512_U20965_M0 | | |10.54.36.33 |05:08:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T513_U20966_M0 | | |10.54.36.19 |05:08:58| |
|norm|23 | | |
0|
|SYNC_RFC |T514_U20967_M0 | | |smprd02.niladv.org |05:09:00| |
|norm|1 | | |
0|
|HTTP_NORMAL |T515_U20968_M0 | | |10.54.36.12 |05:09:02| |
|norm|6 | | |
0|
|HTTP_NORMAL |T516_U20969_M0 | | |10.54.36.30 |05:09:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T517_U20971_M0 | | |10.54.36.38 |05:09:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T518_U20974_M0 | | |10.54.36.17 |05:09:05| |
|norm|3 | | |
0|
|SYNC_RFC |T519_U20975_M0 | | |smprd02.niladv.org |05:09:06| |
|norm|1 | | |
0|
|SYNC_RFC |T520_U20976_M0 | | |smprd02.niladv.org |05:09:06| |
|norm|1 | | |
0|
|SYNC_RFC |T521_U20977_M0 | | |smprd02.niladv.org |05:09:08| |
|norm|1 | | |
0|
|SYNC_RFC |T522_U20997_M0 | | |smprd02.niladv.org |05:09:46| |
|norm|1 | | |
0|
|HTTP_NORMAL |T523_U20998_M0 | | |10.54.36.29 |05:09:48| |
|norm|6 | | |
0|
|HTTP_NORMAL |T524_U21071_M0 | | |10.50.47.10 |05:11:53| |
|norm|3 | | |
0|
|SYNC_RFC |T525_U21069_M0 | | |smprd02.niladv.org |05:11:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T526_U21370_M0 | | |10.54.36.35 |05:21:28| |
|norm|2 | | |
0|
|SYNC_RFC |T527_U20984_M0 | | |smprd02.niladv.org |05:09:15| |
|norm|1 | | |
0|
|SYNC_RFC |T528_U20985_M0 | | |smprd02.niladv.org |05:09:17| |
|norm|1 | | |
0|
|SYNC_RFC |T529_U20986_M0 | | |smprd02.niladv.org |05:09:19| |
|norm|1 | | |
0|
|SYNC_RFC |T530_U20987_M0 | | |smprd02.niladv.org |05:09:21| |
|norm|1 | | |
0|
|SYNC_RFC |T531_U20988_M0 | | |smprd02.niladv.org |05:09:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T532_U20992_M0 | | |10.54.36.37 |05:09:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T533_U20994_M0 | | |10.54.36.26 |05:09:33| |
|norm|3 | | |
0|
|SYNC_RFC |T534_U21096_M0 | | |smprd02.niladv.org |05:12:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T535_U21000_M0 | | |10.50.47.10 |05:09:53| |
|norm|4 | | |
0|
|HTTP_NORMAL |T536_U21001_M0 | | |10.54.36.34 |05:09:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T537_U21002_M0 | | |10.54.36.25 |05:10:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T538_U21004_M0 | | |10.54.36.36 |05:10:04| |
|norm|3 | | |
0|
|SYNC_RFC |T539_U21007_M0 | | |smprd02.niladv.org |05:10:06| |
|norm|1 | | |
0|
|SYNC_RFC |T540_U21008_M0 | | |smprd02.niladv.org |05:10:08| |
|norm|1 | | |
0|
|SYNC_RFC |T541_U21009_M0 | | |smprd02.niladv.org |05:10:08| |
|norm|1 | | |
0|
|SYNC_RFC |T542_U21010_M0 | | |smprd02.niladv.org |05:10:09| |
|norm|1 | | |
0|
|SYNC_RFC |T543_U21012_M0 | | |smprd02.niladv.org |05:10:15| |
|norm|1 | | |
0|
|SYNC_RFC |T544_U21013_M0 | | |smprd02.niladv.org |05:10:17| |
|norm|1 | | |
0|
|SYNC_RFC |T545_U21014_M0 | | |smprd02.niladv.org |05:10:19| |
|norm|1 | | |
0|
|SYNC_RFC |T546_U21015_M0 | | |smprd02.niladv.org |05:10:21| |
|norm|1 | | |
0|
|SYNC_RFC |T547_U21016_M0 | | |smprd02.niladv.org |05:10:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T548_U21020_M0 | | |10.54.36.35 |05:10:27| |
|norm|3 | | |
0|
|SYNC_RFC |T549_U21103_M0 | | |smprd02.niladv.org |05:13:01| |
|norm|1 | | |
0|
|SYNC_RFC |T550_U21049_M0 | | |smprd02.niladv.org |05:11:08| |
|norm|1 | | |
0|
|HTTP_NORMAL |T551_U21047_M0 | | |10.54.36.38 |05:11:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T552_U21048_M0 | | |10.54.36.17 |05:11:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T553_U21041_M0 | | |10.54.36.15 |05:11:00| |
|norm|3 | | |
0|
|HTTP_NORMAL |T554_U21040_M0 | | |10.54.36.33 |05:10:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T555_U21042_M0 | | |10.54.36.12 |05:11:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T556_U21043_M0 | | |10.54.36.30 |05:11:03| |
|norm|3 | | |
0|
|SYNC_RFC |T557_U21129_M0 | | |smprd02.niladv.org |05:13:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T558_U21031_M0 | | |10.54.36.29 |05:10:38| |
|norm|13 | | |
0|
|SYNC_RFC |T559_U21035_M0 | | |smprd02.niladv.org |05:10:50| |
|norm|1 | | |
0|
|HTTP_NORMAL |T560_U21036_M0 | | |10.54.36.32 |05:10:50| |
|norm|11 | | |
0|
|HTTP_NORMAL |T561_U21037_M0 | | |10.54.36.27 |05:10:50| |
|norm|4 | | |
0|
|SYNC_RFC |T562_U21050_M0 | | |smprd02.niladv.org |05:11:09| |
|norm|1 | | |
0|
|SYNC_RFC |T563_U21374_M0 | | |smprd02.niladv.org |05:21:49| |
|norm|1 | | |
0|
|SYNC_RFC |T564_U21053_M0 | | |smprd02.niladv.org |05:11:15| |
|norm|1 | | |
0|
|SYNC_RFC |T565_U21054_M0 | | |smprd02.niladv.org |05:11:17| |
|norm|1 | | |
0|
|SYNC_RFC |T566_U21055_M0 | | |smprd02.niladv.org |05:11:19| |
|norm|1 | | |
0|
|SYNC_RFC |T567_U21056_M0 | | |smprd02.niladv.org |05:11:21| |
|norm|1 | | |
0|
|SYNC_RFC |T568_U21057_M0 | | |smprd02.niladv.org |05:11:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T569_U21061_M0 | | |10.54.36.13 |05:11:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T570_U21062_M0 | | |10.54.36.11 |05:11:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T571_U21063_M0 | | |10.54.36.37 |05:11:33| |
|norm|3 | | |
0|
|SYNC_RFC |T572_U21442_M0 | | |smprd02.niladv.org |05:23:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T573_U21066_M0 | | |10.54.36.14 |05:11:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T574_U21067_M0 | | |10.54.36.41 |05:11:39| |
|norm|3 | | |
0|
|HTTP_NORMAL |T575_U21072_M0 | | |10.54.36.19 |05:11:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T576_U21073_M0 | | |10.54.36.34 |05:11:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T577_U21097_M0 | | |10.54.36.29 |05:12:49| |
|norm|3 | | |
0|
|SYNC_RFC |T578_U21075_M0 | | |smprd02.niladv.org |05:12:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T579_U21076_M0 | | |10.54.36.25 |05:12:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T580_U21079_M0 | | |10.54.36.36 |05:12:05| |
|norm|3 | | |
0|
|SYNC_RFC |T581_U21080_M0 | | |smprd02.niladv.org |05:12:08| |
|norm|1 | | |
0|
|SYNC_RFC |T582_U21081_M0 | | |smprd02.niladv.org |05:12:10| |
|norm|1 | | |
0|
|SYNC_RFC |T583_U21082_M0 | | |smprd02.niladv.org |05:12:11| |
|norm|1 | | |
0|
|SYNC_RFC |T584_U21084_M0 | | |smprd02.niladv.org |05:12:15| |
|norm|1 | | |
0|
|SYNC_RFC |T585_U21085_M0 | | |smprd02.niladv.org |05:12:17| |
|norm|1 | | |
0|
|SYNC_RFC |T586_U21086_M0 | | |smprd02.niladv.org |05:12:19| |
|norm|1 | | |
0|
|SYNC_RFC |T587_U21088_M0 | | |smprd02.niladv.org |05:12:21| |
|norm|1 | | |
0|
|SYNC_RFC |T588_U21089_M0 | | |smprd02.niladv.org |05:12:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T589_U21092_M0 | | |10.54.36.35 |05:12:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T590_U21094_M0 | | |10.54.36.28 |05:12:34| |
|norm|3 | | |
0|
|HTTP_NORMAL |T591_U21098_M0 | | |10.54.36.29 |05:12:49| |
|norm|3 | | |
0|
|HTTP_NORMAL |T592_U21099_M0 | | |10.54.36.29 |05:12:49| |
|norm|3 | | |
0|
|SYNC_RFC |T593_U21101_M0 | | |smprd02.niladv.org |05:12:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T594_U21102_M0 | | |10.54.36.33 |05:12:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T595_U21104_M0 | | |10.54.36.15 |05:13:01| |
|norm|5 | | |
0|
|HTTP_NORMAL |T596_U21105_M0 | | |10.54.36.12 |05:13:03| |
|norm|6 | | |
0|
|HTTP_NORMAL |T597_U21120_M0 | | |10.54.36.11 |05:13:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T598_U21108_M0 | | |10.54.36.30 |05:13:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T599_U21109_M0 | | |10.54.36.17 |05:13:05| |
|norm|3 | | |
0|
|SYNC_RFC |T600_U21110_M0 | | |smprd02.niladv.org |05:13:08| |
|norm|1 | | |
0|
|SYNC_RFC |T601_U21183_M0 | | |smprd02.niladv.org |05:15:48| |
|norm|1 | | |
0|
|SYNC_RFC |T602_U21113_M0 | | |smprd02.niladv.org |05:13:15| |
|norm|1 | | |
0|
|SYNC_RFC |T603_U21114_M0 | | |smprd02.niladv.org |05:13:17| |
|norm|1 | | |
0|
|SYNC_RFC |T604_U21115_M0 | | |smprd02.niladv.org |05:13:19| |
|norm|1 | | |
0|
|SYNC_RFC |T605_U21117_M0 | | |smprd02.niladv.org |05:13:21| |
|norm|1 | | |
0|
|SYNC_RFC |T606_U21118_M0 | | |smprd02.niladv.org |05:13:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T607_U21121_M0 | | |10.54.36.13 |05:13:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T608_U21122_M0 | | |10.54.36.37 |05:13:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T609_U21473_M0 | | |10.54.36.41 |05:24:41| |
|norm|2 | | |
0|
|HTTP_NORMAL |T610_U21125_M0 | | |10.54.36.14 |05:13:37| |
|norm|3 | | |
0|
|HTTP_NORMAL |T611_U21375_M0 | | |10.50.47.10 |05:21:53| |
|norm|2 | | |
0|
|HTTP_NORMAL |T612_U21130_M0 | | |10.54.36.32 |05:13:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T613_U21131_M0 | | |10.54.36.27 |05:13:50| |
|norm|4 | | |
0|
|SYNC_RFC |T614_U21133_M0 | | |smprd02.niladv.org |05:13:54| |
|norm|1 | | |
0|
|SYNC_RFC |T615_U21134_M0 | | |smprd02.niladv.org |05:13:56| |
|norm|1 | | |
0|
|HTTP_NORMAL |T616_U21135_M0 | | |10.54.36.19 |05:13:58| |
|norm|11 | | |
0|
|HTTP_NORMAL |T617_U21136_M0 | | |10.54.36.34 |05:13:59| |
|norm|6 | | |
0|
|HTTP_NORMAL |T618_U21138_M0 | | |10.54.36.38 |05:14:03| |
|norm|5 | | |
0|
|SYNC_RFC |T619_U21141_M0 | | |smprd02.niladv.org |05:14:08| |
|norm|1 | | |
0|
|SYNC_RFC |T620_U21142_M0 | | |smprd02.niladv.org |05:14:10| |
|norm|1 | | |
0|
|SYNC_RFC |T621_U21144_M0 | | |smprd02.niladv.org |05:14:15| |
|norm|1 | | |
0|
|SYNC_RFC |T622_U21145_M0 | | |smprd02.niladv.org |05:14:17| |
|norm|1 | | |
0|
|SYNC_RFC |T623_U21146_M0 | | |smprd02.niladv.org |05:14:19| |
|norm|1 | | |
0|
|SYNC_RFC |T624_U21147_M0 | | |smprd02.niladv.org |05:14:21| |
|norm|1 | | |
0|
|SYNC_RFC |T625_U21148_M0 | | |smprd02.niladv.org |05:14:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T626_U21152_M0 | | |10.54.36.35 |05:14:28| |
|norm|3 | | |
0|
|HTTP_NORMAL |T627_U21157_M0 | | |10.54.36.29 |05:14:50| |
|norm|7 | | |
0|
|HTTP_NORMAL |T628_U21159_M0 | | |10.50.47.10 |05:14:53| |
|norm|4 | | |
0|
|SYNC_RFC |T629_U21160_M0 | | |smprd02.niladv.org |05:14:56| |
|norm|1 | | |
0|
|HTTP_NORMAL |T630_U21161_M0 | | |10.54.36.36 |05:15:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T631_U21162_M0 | | |10.54.36.25 |05:15:03| |
|norm|3 | | |
0|
|SYNC_RFC |T632_U21166_M0 | | |smprd02.niladv.org |05:15:08| |
|norm|1 | | |
0|
|HTTP_NORMAL |T633_U21167_M0 | | |10.50.47.13 |05:15:12| |
|norm|6 | | |
0|
|SYNC_RFC |T634_U21169_M0 | | |smprd02.niladv.org |05:15:15| |
|norm|1 | | |
0|
|SYNC_RFC |T635_U21170_M0 | | |smprd02.niladv.org |05:15:17| |
|norm|1 | | |
0|
|SYNC_RFC |T636_U21171_M0 | | |smprd02.niladv.org |05:15:19| |
|norm|1 | | |
0|
|SYNC_RFC |T637_U21172_M0 | | |smprd02.niladv.org |05:15:21| |
|norm|1 | | |
0|
|SYNC_RFC |T638_U21173_M0 | | |smprd02.niladv.org |05:15:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T639_U21178_M0 | | |10.54.36.29 |05:15:39| |
|norm|12 | | |
0|
|HTTP_NORMAL |T640_U21310_M0 | | |10.54.36.35 |05:19:27| |
|norm|3 | | |
0|
|SYNC_RFC |T641_U21180_M0 | | |smprd02.niladv.org |05:15:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T642_U21184_M0 | | |10.54.36.27 |05:15:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T643_U21185_M0 | | |10.54.36.32 |05:15:51| |
|norm|17 | | |
0|
|SYNC_RFC |T644_U21187_M0 | | |smprd02.niladv.org |05:15:56| |
|norm|1 | | |
0|
|HTTP_NORMAL |T645_U21188_M0 | | |10.54.36.33 |05:15:57| |
|norm|3 | | |
0|
|HTTP_NORMAL |T646_U21189_M0 | | |10.54.36.34 |05:15:59| |
|norm|3 | | |
0|
|HTTP_NORMAL |T647_U21190_M0 | | |10.54.36.15 |05:16:00| |
|norm|5 | | |
0|
|HTTP_NORMAL |T648_U21191_M0 | | |10.54.36.12 |05:16:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T649_U21192_M0 | | |10.54.36.17 |05:16:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T650_U21193_M0 | | |10.54.36.30 |05:16:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T651_U21197_M0 | | |10.54.36.38 |05:16:04| |
|norm|6 | | |
0|
|SYNC_RFC |T652_U21198_M0 | | |smprd02.niladv.org |05:16:08| |
|norm|1 | | |
0|
|SYNC_RFC |T653_U21199_M0 | | |smprd02.niladv.org |05:16:10| |
|norm|1 | | |
0|
|SYNC_RFC |T654_U21201_M0 | | |smprd02.niladv.org |05:16:15| |
|norm|1 | | |
0|
|SYNC_RFC |T655_U21202_M0 | | |smprd02.niladv.org |05:16:17| |
|norm|1 | | |
0|
|SYNC_RFC |T656_U21203_M0 | | |smprd02.niladv.org |05:16:19| |
|norm|1 | | |
0|
|SYNC_RFC |T657_U21204_M0 | | |smprd02.niladv.org |05:16:21| |
|norm|1 | | |
0|
|SYNC_RFC |T658_U21205_M0 | | |smprd02.niladv.org |05:16:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T659_U21209_M0 | | |10.54.36.35 |05:16:29| |
|norm|3 | | |
0|
|HTTP_NORMAL |T660_U21210_M0 | | |10.54.36.13 |05:16:30| |
|norm|6 | | |
0|
|HTTP_NORMAL |T661_U21211_M0 | | |10.54.36.37 |05:16:32| |
|norm|5 | | |
0|
|HTTP_NORMAL |T662_U21212_M0 | | |10.54.36.26 |05:16:32| |
|norm|4 | | |
0|
|HTTP_NORMAL |T663_U21213_M0 | | |10.54.36.11 |05:16:32| |
|norm|3 | | |
0|
|SYNC_RFC |T664_U21241_M0 | | |smprd02.niladv.org |05:17:15| |
|norm|1 | | |
0|
|SYNC_RFC |T665_U21242_M0 | | |smprd02.niladv.org |05:17:17| |
|norm|1 | | |
0|
|SYNC_RFC |T666_U21238_M0 | | |smprd02.niladv.org |05:17:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T667_U21553_M0 | | |10.54.36.29 |05:27:48| |
|norm|7 | | |
0|
|HTTP_NORMAL |T668_U21233_M0 | | |10.54.36.36 |05:17:03| |
|norm|4 | | |
0|
|SYNC_RFC |T669_U21231_M0 | | |smprd02.niladv.org |05:16:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T670_U21234_M0 | | |10.54.36.25 |05:17:03| |
|norm|4 | | |
0|
|HTTP_NORMAL |T671_U21232_M0 | | |10.54.36.19 |05:16:57| |
|norm|6 | | |
0|
|HTTP_NORMAL |T672_U21223_M0 | | |10.54.36.14 |05:16:35| |
|norm|4 | | |
0|
|SYNC_RFC |T673_U21224_M0 | | |smprd02.niladv.org |05:16:41| |
|norm|1 | | |
0|
|SYNC_RFC |T674_U21228_M0 | | |smprd02.niladv.org |05:16:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T675_U21230_M0 | | |10.50.47.10 |05:16:54| |
|norm|4 | | |
0|
|SYNC_RFC |T676_U21243_M0 | | |smprd02.niladv.org |05:17:19| |
|norm|1 | | |
0|
|SYNC_RFC |T677_U21244_M0 | | |smprd02.niladv.org |05:17:21| |
|norm|1 | | |
0|
|SYNC_RFC |T678_U21245_M0 | | |smprd02.niladv.org |05:17:23| |
|norm|1 | | |
0|
|SYNC_RFC |T679_U21614_M0 | | |smprd02.niladv.org |05:29:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T680_U21251_M0 | | |10.54.36.41 |05:17:40| |
|norm|3 | | |
0|
|HTTP_NORMAL |T681_U21255_M0 | | |10.54.36.29 |05:17:49| |
|norm|6 | | |
0|
|HTTP_NORMAL |T682_U21256_M0 | | |10.54.36.29 |05:17:49| |
|norm|3 | | |
0|
|HTTP_NORMAL |T683_U21257_M0 | | |10.54.36.29 |05:17:49| |
|norm|3 | | |
0|
|SYNC_RFC |T684_U21259_M0 | | |smprd02.niladv.org |05:17:54| |
|norm|1 | | |
0|
|SYNC_RFC |T685_U21260_M0 | | |smprd02.niladv.org |05:17:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T686_U21261_M0 | | |10.54.36.33 |05:17:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T687_U21262_M0 | | |10.54.36.12 |05:18:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T688_U21263_M0 | | |10.54.36.15 |05:18:01| |
|norm|6 | | |
0|
|HTTP_NORMAL |T689_U21264_M0 | | |10.54.36.17 |05:18:02| |
|norm|3 | | |
0|
|HTTP_NORMAL |T690_U21266_M0 | | |10.54.36.30 |05:18:04| |
|norm|3 | | |
0|
|HTTP_NORMAL |T691_U21269_M0 | | |10.54.36.38 |05:18:04| |
|norm|5 | | |
0|
|SYNC_RFC |T692_U21270_M0 | | |smprd02.niladv.org |05:18:10| |
|norm|1 | | |
0|
|SYNC_RFC |T693_U21272_M0 | | |smprd02.niladv.org |05:18:15| |
|norm|1 | | |
0|
|SYNC_RFC |T694_U21273_M0 | | |smprd02.niladv.org |05:18:17| |
|norm|1 | | |
0|
|SYNC_RFC |T695_U21274_M0 | | |smprd02.niladv.org |05:18:19| |
|norm|1 | | |
0|
|SYNC_RFC |T696_U21275_M0 | | |smprd02.niladv.org |05:18:21| |
|norm|1 | | |
0|
|SYNC_RFC |T697_U21276_M0 | | |smprd02.niladv.org |05:18:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T698_U21280_M0 | | |10.54.36.13 |05:18:30| |
|norm|5 | | |
0|
|HTTP_NORMAL |T699_U21281_M0 | | |10.54.36.37 |05:18:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T700_U21282_M0 | | |10.54.36.26 |05:18:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T701_U21283_M0 | | |10.54.36.11 |05:18:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T702_U21285_M0 | | |10.54.36.14 |05:18:35| |
|norm|3 | | |
0|
|HTTP_NORMAL |T703_U21289_M0 | | |10.54.36.27 |05:18:50| |
|norm|3 | | |
0|
|HTTP_NORMAL |T704_U21290_M0 | | |10.54.36.32 |05:18:51| |
|norm|3 | | |
0|
|HTTP_NORMAL |T705_U21292_M0 | | |10.50.47.10 |05:18:54| |
|norm|3 | | |
0|
|SYNC_RFC |T706_U21293_M0 | | |smprd02.niladv.org |05:18:54| |
|norm|1 | | |
0|
|SYNC_RFC |T707_U21294_M0 | | |smprd02.niladv.org |05:18:57| |
|norm|1 | | |
0|
|HTTP_NORMAL |T708_U21295_M0 | | |10.54.36.34 |05:18:58| |
|norm|4 | | |
0|
|HTTP_NORMAL |T709_U21296_M0 | | |10.54.36.19 |05:18:58| |
|norm|16 | | |
0|
|HTTP_NORMAL |T710_U21297_M0 | | |10.54.36.25 |05:19:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T711_U21301_M0 | | |10.50.47.13 |05:19:13| |
|norm|4 | | |
0|
|SYNC_RFC |T712_U21303_M0 | | |smprd02.niladv.org |05:19:15| |
|norm|1 | | |
0|
|SYNC_RFC |T713_U21304_M0 | | |smprd02.niladv.org |05:19:17| |
|norm|1 | | |
0|
|SYNC_RFC |T714_U21305_M0 | | |smprd02.niladv.org |05:19:19| |
|norm|1 | | |
0|
|SYNC_RFC |T715_U21306_M0 | | |smprd02.niladv.org |05:19:21| |
|norm|1 | | |
0|
|SYNC_RFC |T716_U21308_M0 | | |smprd02.niladv.org |05:19:23| |
|norm|1 | | |
0|
|SYNC_RFC |T717_U21315_M0 | | |smprd02.niladv.org |05:19:48| |
|norm|1 | | |
0|
|HTTP_NORMAL |T718_U21317_M0 | | |10.54.36.12 |05:20:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T719_U21318_M0 | | |10.54.36.36 |05:20:03| |
|norm|3 | | |
0|
|HTTP_NORMAL |T720_U21322_M0 | | |10.54.36.17 |05:20:05| |
|norm|3 | | |
0|
|SYNC_RFC |T721_U21323_M0 | | |smprd02.niladv.org |05:20:10| |
|norm|1 | | |
0|
|SYNC_RFC |T722_U21325_M0 | | |smprd02.niladv.org |05:20:15| |
|norm|1 | | |
0|
|SYNC_RFC |T723_U21326_M0 | | |smprd02.niladv.org |05:20:17| |
|norm|1 | | |
0|
|SYNC_RFC |T724_U21327_M0 | | |smprd02.niladv.org |05:20:19| |
|norm|1 | | |
0|
|SYNC_RFC |T725_U21328_M0 | | |smprd02.niladv.org |05:20:21| |
|norm|1 | | |
0|
|SYNC_RFC |T726_U21330_M0 | | |smprd02.niladv.org |05:20:23| |
|norm|1 | | |
0|
|HTTP_NORMAL |T727_U21333_M0 | | |10.54.36.13 |05:20:32| |
|norm|3 | | |
0|
|HTTP_NORMAL |T728_U21334_M0 | | |10.54.36.11 |05:20:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T729_U21335_M0 | | |10.54.36.37 |05:20:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T730_U21336_M0 | | |10.54.36.28 |05:20:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T731_U21337_M0 | | |10.54.36.26 |05:20:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T732_U21339_M0 | | |10.54.36.14 |05:20:37| |
|norm|2 | | |
0|
|HTTP_NORMAL |T733_U21340_M0 | | |10.54.36.29 |05:20:39| |
|norm|10 | | |
0|
|HTTP_NORMAL |T734_U21341_M0 | | |10.54.36.41 |05:20:40| |
|norm|2 | | |
0|
|SYNC_RFC |T735_U21342_M0 | | |smprd02.niladv.org |05:20:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T736_U21346_M0 | | |10.54.36.29 |05:20:48| |
|norm|4 | | |
0|
|SYNC_RFC |T737_U21347_M0 | | |smprd02.niladv.org |05:20:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T738_U21348_M0 | | |10.54.36.27 |05:20:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T739_U21349_M0 | | |10.54.36.32 |05:20:51| |
|norm|15 | | |
0|
|HTTP_NORMAL |T740_U21351_M0 | | |10.54.36.40 |05:20:56| |
|norm|2 | | |
0|
|HTTP_NORMAL |T741_U21352_M0 | | |10.54.36.33 |05:20:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T742_U21353_M0 | | |10.54.36.34 |05:20:58| |
|norm|5 | | |
0|
|HTTP_NORMAL |T743_U21354_M0 | | |10.54.36.15 |05:21:00| |
|norm|2 | | |
0|
|HTTP_NORMAL |T744_U21355_M0 | | |10.54.36.25 |05:21:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T745_U21356_M0 | | |10.54.36.38 |05:21:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T746_U21358_M0 | | |10.54.36.30 |05:21:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T747_U21361_M0 | | |10.50.47.13 |05:21:13| |
|norm|3 | | |
0|
|SYNC_RFC |T748_U21363_M0 | | |smprd02.niladv.org |05:21:15| |
|norm|1 | | |
0|
|SYNC_RFC |T749_U21364_M0 | | |smprd02.niladv.org |05:21:17| |
|norm|1 | | |
0|
|SYNC_RFC |T750_U21365_M0 | | |smprd02.niladv.org |05:21:19| |
|norm|1 | | |
0|
|SYNC_RFC |T751_U21366_M0 | | |smprd02.niladv.org |05:21:21| |
|norm|1 | | |
0|
|SYNC_RFC |T752_U21368_M0 | | |smprd02.niladv.org |05:21:23| |
|norm|1 | | |
0|
|SYNC_RFC |T753_U21372_M0 | | |smprd02.niladv.org |05:21:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T754_U21377_M0 | | |10.54.36.19 |05:21:57| |
|norm|2 | | |
0|
|HTTP_NORMAL |T755_U21443_M0 | | |10.54.36.32 |05:23:50| |
|norm|15 | | |
0|
|SYNC_RFC |T756_U21379_M0 | | |smprd02.niladv.org |05:22:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T757_U21380_M0 | | |10.54.36.12 |05:22:02| |
|norm|2 | | |
0|
|HTTP_NORMAL |T758_U21382_M0 | | |10.54.36.36 |05:22:04| |
|norm|2 | | |
0|
|HTTP_NORMAL |T759_U21384_M0 | | |10.54.36.17 |05:22:05| |
|norm|2 | | |
0|
|SYNC_RFC |T760_U21385_M0 | | |smprd02.niladv.org |05:22:10| |
|norm|1 | | |
0|
|SYNC_RFC |T761_U21386_M0 | | |smprd02.niladv.org |05:22:11| |
|norm|1 | | |
0|
|SYNC_RFC |T762_U21388_M0 | | |smprd02.niladv.org |05:22:16| |
|norm|1 | | |
0|
|SYNC_RFC |T763_U21389_M0 | | |smprd02.niladv.org |05:22:17| |
|norm|1 | | |
0|
|SYNC_RFC |T764_U21390_M0 | | |smprd02.niladv.org |05:22:20| |
|norm|1 | | |
0|
|SYNC_RFC |T765_U21392_M0 | | |smprd02.niladv.org |05:22:22| |
|norm|1 | | |
0|
|SYNC_RFC |T766_U21394_M0 | | |smprd02.niladv.org |05:22:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T767_U21521_M0 | | |10.54.36.35 |05:26:27| |
|norm|2 | | |
0|
|SYNC_RFC |T768_U21424_M0 | | |smprd02.niladv.org |05:23:01| |
|norm|1 | | |
0|
|HTTP_NORMAL |T769_U21426_M0 | | |10.54.36.38 |05:23:04| |
|norm|2 | | |
0|
|HTTP_NORMAL |T770_U21420_M0 | | |10.54.36.33 |05:22:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T771_U21421_M0 | | |10.54.36.34 |05:22:59| |
|norm|2 | | |
0|
|SYNC_RFC |T772_U21419_M0 | | |smprd02.niladv.org |05:22:55| |
|norm|1 | | |
0|
|SYNC_RFC |T773_U21498_M0 | | |smprd02.niladv.org |05:25:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T774_U21428_M0 | | |10.54.36.30 |05:23:05| |
|norm|3 | | |
0|
|HTTP_NORMAL |T775_U21404_M0 | | |10.54.36.13 |05:22:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T776_U21405_M0 | | |10.54.36.11 |05:22:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T777_U21406_M0 | | |10.54.36.37 |05:22:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T778_U21407_M0 | | |10.54.36.26 |05:22:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T779_U21409_M0 | | |10.54.36.28 |05:22:34| |
|norm|2 | | |
0|
|HTTP_NORMAL |T780_U21479_M0 | | |10.54.36.15 |05:25:01| |
|norm|3 | | |
0|
|HTTP_NORMAL |T781_U21411_M0 | | |10.54.36.41 |05:22:40| |
|norm|2 | | |
0|
|HTTP_NORMAL |T782_U21415_M0 | | |10.54.36.29 |05:22:48| |
|norm|2 | | |
0|
|HTTP_NORMAL |T783_U21416_M0 | | |10.54.36.29 |05:22:48| |
|norm|9 | | |
0|
|HTTP_NORMAL |T784_U21417_M0 | | |10.54.36.29 |05:22:48| |
|norm|2 | | |
0|
|SYNC_RFC |T785_U21431_M0 | | |smprd02.niladv.org |05:23:16| |
|norm|1 | | |
0|
|SYNC_RFC |T786_U21432_M0 | | |smprd02.niladv.org |05:23:18| |
|norm|1 | | |
0|
|SYNC_RFC |T787_U21433_M0 | | |smprd02.niladv.org |05:23:20| |
|norm|1 | | |
0|
|SYNC_RFC |T788_U21435_M0 | | |smprd02.niladv.org |05:23:22| |
|norm|1 | | |
0|
|SYNC_RFC |T789_U21437_M0 | | |smprd02.niladv.org |05:23:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T790_U21439_M0 | | |10.54.36.35 |05:23:28| |
|norm|2 | | |
0|
|HTTP_NORMAL |T791_U21444_M0 | | |10.54.36.27 |05:23:50| |
|norm|4 | | |
0|
|HTTP_NORMAL |T792_U21445_M0 | | |10.50.47.10 |05:23:53| |
|norm|2 | | |
0|
|SYNC_RFC |T793_U21447_M0 | | |smprd02.niladv.org |05:23:55| |
|norm|1 | | |
0|
|HTTP_NORMAL |T794_U21448_M0 | | |10.54.36.19 |05:23:58| |
|norm|15 | | |
0|
|SYNC_RFC |T795_U21480_M0 | | |smprd02.niladv.org |05:25:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T796_U21451_M0 | | |10.54.36.12 |05:24:03| |
|norm|5 | | |
0|
|HTTP_NORMAL |T797_U21452_M0 | | |10.54.36.25 |05:24:03| |
|norm|2 | | |
0|
|SYNC_RFC |T798_U21453_M0 | | |smprd02.niladv.org |05:24:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T799_U21455_M0 | | |10.54.36.17 |05:24:05| |
|norm|2 | | |
0|
|SYNC_RFC |T800_U21456_M0 | | |smprd02.niladv.org |05:24:10| |
|norm|1 | | |
0|
|SYNC_RFC |T801_U21458_M0 | | |smprd02.niladv.org |05:24:15| |
|norm|1 | | |
0|
|SYNC_RFC |T802_U21459_M0 | | |smprd02.niladv.org |05:24:16| |
|norm|1 | | |
0|
|SYNC_RFC |T803_U21460_M0 | | |smprd02.niladv.org |05:24:18| |
|norm|1 | | |
0|
|SYNC_RFC |T804_U21461_M0 | | |smprd02.niladv.org |05:24:20| |
|norm|1 | | |
0|
|SYNC_RFC |T805_U21464_M0 | | |smprd02.niladv.org |05:24:22| |
|norm|1 | | |
0|
|SYNC_RFC |T806_U21466_M0 | | |smprd02.niladv.org |05:24:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T807_U21467_M0 | | |10.54.36.11 |05:24:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T808_U21468_M0 | | |10.54.36.13 |05:24:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T809_U21469_M0 | | |10.54.36.37 |05:24:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T810_U21471_M0 | | |10.54.36.14 |05:24:37| |
|norm|2 | | |
0|
|SYNC_RFC |T811_U21475_M0 | | |smprd02.niladv.org |05:24:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T812_U21476_M0 | | |10.54.36.29 |05:24:49| |
|norm|2 | | |
0|
|HTTP_NORMAL |T813_U21478_M0 | | |10.54.36.33 |05:24:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T814_U21482_M0 | | |10.54.36.36 |05:25:04| |
|norm|2 | | |
0|
|SYNC_RFC |T815_U21493_M0 | | |smprd02.niladv.org |05:25:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T816_U21485_M0 | | |10.50.47.13 |05:25:14| |
|norm|5 | | |
0|
|SYNC_RFC |T817_U21486_M0 | | |smprd02.niladv.org |05:25:15| |
|norm|1 | | |
0|
|SYNC_RFC |T818_U21487_M0 | | |smprd02.niladv.org |05:25:16| |
|norm|1 | | |
0|
|SYNC_RFC |T819_U21488_M0 | | |smprd02.niladv.org |05:25:18| |
|norm|1 | | |
0|
|SYNC_RFC |T820_U21489_M0 | | |smprd02.niladv.org |05:25:20| |
|norm|1 | | |
0|
|SYNC_RFC |T821_U21491_M0 | | |smprd02.niladv.org |05:25:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T822_U21494_M0 | | |10.54.36.28 |05:25:33| |
|norm|3 | | |
0|
|HTTP_NORMAL |T823_U21495_M0 | | |10.54.36.26 |05:25:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T824_U21497_M0 | | |10.54.36.29 |05:25:39| |
|norm|9 | | |
0|
|SYNC_RFC |T825_U21501_M0 | | |smprd02.niladv.org |05:25:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T826_U21502_M0 | | |10.50.47.10 |05:25:53| |
|norm|2 | | |
0|
|HTTP_NORMAL |T827_U21504_M0 | | |10.54.36.34 |05:25:58| |
|norm|3 | | |
0|
|HTTP_NORMAL |T828_U21505_M0 | | |10.54.36.19 |05:25:58| |
|norm|2 | | |
0|
|HTTP_NORMAL |T829_U21506_M0 | | |10.54.36.38 |05:26:02| |
|norm|4 | | |
0|
|HTTP_NORMAL |T830_U21507_M0 | | |10.54.36.30 |05:26:03| |
|norm|3 | | |
0|
|SYNC_RFC |T831_U21511_M0 | | |smprd02.niladv.org |05:26:10| |
|norm|1 | | |
0|
|SYNC_RFC |T832_U21513_M0 | | |smprd02.niladv.org |05:26:16| |
|norm|1 | | |
0|
|SYNC_RFC |T833_U21514_M0 | | |smprd02.niladv.org |05:26:18| |
|norm|1 | | |
0|
|SYNC_RFC |T834_U21515_M0 | | |smprd02.niladv.org |05:26:20| |
|norm|1 | | |
0|
|SYNC_RFC |T835_U21516_M0 | | |smprd02.niladv.org |05:26:22| |
|norm|1 | | |
0|
|HTTP_NORMAL |T836_U21546_M0 | | |10.54.36.13 |05:27:32| |
|norm|4 | | |
0|
|SYNC_RFC |T837_U21520_M0 | | |smprd02.niladv.org |05:26:24| |
|norm|1 | | |
0|
|SYNC_RFC |T838_U21523_M0 | | |smprd02.niladv.org |05:26:41| |
|norm|1 | | |
0|
|HTTP_NORMAL |T839_U21527_M0 | | |10.54.36.32 |05:26:50| |
|norm|9 | | |
0|
|HTTP_NORMAL |T840_U21528_M0 | | |10.54.36.27 |05:26:50| |
|norm|2 | | |
0|
|HTTP_NORMAL |T841_U21530_M0 | | |10.54.36.12 |05:27:02| |
|norm|5 | | |
0|
|HTTP_NORMAL |T842_U21531_M0 | | |10.54.36.25 |05:27:03| |
|norm|2 | | |
0|
|HTTP_NORMAL |T843_U21535_M0 | | |10.54.36.17 |05:27:05| |
|norm|2 | | |
0|
|SYNC_RFC |T844_U21536_M0 | | |smprd02.niladv.org |05:27:11| |
|norm|1 | | |
0|
|HTTP_NORMAL |T845_U21538_M0 | | |10.50.47.13 |05:27:14| |
|norm|2 | | |
0|
|SYNC_RFC |T846_U21539_M0 | | |smprd02.niladv.org |05:27:16| |
|norm|1 | | |
0|
|SYNC_RFC |T847_U21540_M0 | | |smprd02.niladv.org |05:27:18| |
|norm|1 | | |
0|
|SYNC_RFC |T848_U21541_M0 | | |smprd02.niladv.org |05:27:20| |
|norm|1 | | |
0|
|SYNC_RFC |T849_U21542_M0 | | |smprd02.niladv.org |05:27:22| |
|norm|1 | | |
0|
|SYNC_RFC |T850_U21545_M0 | | |smprd02.niladv.org |05:27:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T851_U21547_M0 | | |10.54.36.11 |05:27:32| |
|norm|2 | | |
0|
|HTTP_NORMAL |T852_U21548_M0 | | |10.54.36.37 |05:27:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T853_U21550_M0 | | |10.54.36.14 |05:27:37| |
|norm|2 | | |
0|
|INTERNAL |T854_U21632_M0 |000|SAPSYS | |05:30:34|4 |
|high| | | |
4200|
|HTTP_NORMAL |T855_U21617_M0 | | |10.54.36.12 |05:30:02| |
|norm|1 | | |
0|
|HTTP_NORMAL |T856_U21556_M0 | | |10.54.36.29 |05:27:49| |
|norm|2 | | |
0|
|SYNC_RFC |T857_U21558_M0 | | |smprd02.niladv.org |05:27:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T858_U21559_M0 | | |10.54.36.33 |05:27:57| |
|norm|2 | | |
0|
|SYNC_RFC |T859_U21588_M0 | | |smprd02.niladv.org |05:28:49| |
|norm|1 | | |
0|
|HTTP_NORMAL |T860_U21561_M0 | | |10.54.36.15 |05:28:01| |
|norm|4 | | |
0|
|HTTP_NORMAL |T861_U21562_M0 | | |10.54.36.36 |05:28:03| |
|norm|2 | | |
0|
|SYNC_RFC |T862_U21565_M0 | | |smprd02.niladv.org |05:28:10| |
|norm|1 | | |
0|
|SYNC_RFC |T863_U21567_M0 | | |smprd02.niladv.org |05:28:16| |
|norm|1 | | |
0|
|SYNC_RFC |T864_U21568_M0 | | |smprd02.niladv.org |05:28:18| |
|norm|1 | | |
0|
|SYNC_RFC |T865_U21569_M0 | | |smprd02.niladv.org |05:28:20| |
|norm|1 | | |
0|
|SYNC_RFC |T866_U21571_M0 | | |smprd02.niladv.org |05:28:22| |
|norm|1 | | |
0|
|SYNC_RFC |T867_U21573_M0 | | |smprd02.niladv.org |05:28:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T868_U21597_M0 | | |10.54.36.30 |05:29:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T869_U21598_M0 | | |10.54.36.38 |05:29:03| |
|norm|2 | | |
0|
|SYNC_RFC |T870_U21602_M0 | | |smprd02.niladv.org |05:29:16| |
|norm|1 | | |
0|
|HTTP_NORMAL |T871_U21595_M0 | | |10.54.36.19 |05:28:58| |
|norm|18 | | |
0|
|HTTP_NORMAL |T872_U21594_M0 | | |10.54.36.34 |05:28:58| |
|norm|1 | | |
0|
|HTTP_NORMAL |T873_U21590_M0 | | |10.54.36.32 |05:28:51| |
|norm|2 | | |
0|
|HTTP_NORMAL |T874_U21591_M0 | | |10.50.47.10 |05:28:53| |
|norm|1 | | |
0|
|SYNC_RFC |T875_U21593_M0 | | |smprd02.niladv.org |05:28:54| |
|norm|1 | | |
0|
|HTTP_NORMAL |T876_U21583_M0 | | |10.54.36.26 |05:28:33| |
|norm|2 | | |
0|
|HTTP_NORMAL |T877_U21585_M0 | | |10.54.36.28 |05:28:34| |
|norm|2 | | |
0|
|HTTP_NORMAL |T878_U21589_M0 | | |10.54.36.27 |05:28:50| |
|norm|2 | | |
0|
|SYNC_RFC |T879_U21603_M0 | | |smprd02.niladv.org |05:29:18| |
|norm|1 | | |
0|
|SYNC_RFC |T880_U21604_M0 | | |smprd02.niladv.org |05:29:20| |
|norm|1 | | |
0|
|SYNC_RFC |T881_U21606_M0 | | |smprd02.niladv.org |05:29:22| |
|norm|1 | | |
0|
|SYNC_RFC |T882_U21608_M0 | | |smprd02.niladv.org |05:29:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T883_U21610_M0 | | |10.54.36.35 |05:29:27| |
|norm|1 | | |
0|
|INTERNAL |T884_U21639_M0 | | | |05:30:40|3 |
|high| | | |
0|
|HTTP_NORMAL |T885_U21618_M0 | | |10.54.36.25 |05:30:03| |
|norm|1 | | |
0|
|HTTP_NORMAL |T886_U21621_M0 | | |10.54.36.17 |05:30:05| |
|norm|1 | | |
0|
|SYNC_RFC |T887_U21622_M0 | | |smprd02.niladv.org |05:30:10| |
|norm|1 | | |
0|
|HTTP_NORMAL |T888_U21623_M0 | | |10.50.47.13 |05:30:12| |
|norm|1 | | |
0|
|SYNC_RFC |T889_U21625_M0 | | |smprd02.niladv.org |05:30:16| |
|norm|1 | | |
0|
|SYNC_RFC |T890_U21626_M0 | | |smprd02.niladv.org |05:30:18| |
|norm|1 | | |
0|
|SYNC_RFC |T891_U21627_M0 | | |smprd02.niladv.org |05:30:20| |
|norm|1 | | |
0|
|SYNC_RFC |T892_U21629_M0 | | |smprd02.niladv.org |05:30:22| |
|norm|1 | | |
0|
|SYNC_RFC |T893_U21631_M0 | | |smprd02.niladv.org |05:30:24| |
|norm|1 | | |
0|
|HTTP_NORMAL |T894_U21633_M0 | | |10.54.36.13 |05:30:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T895_U21634_M0 | | |10.54.36.11 |05:30:32| |
|norm|1 | | |
0|
|HTTP_NORMAL |T896_U21635_M0 | | |10.54.36.37 |05:30:33| |
|norm|1 | | |
0|
|HTTP_NORMAL |T897_U21637_M0 | | |10.54.36.14 |05:30:37| |
|norm|1 | | |
0|
|HTTP_NORMAL |T898_U21638_M0 | | |10.54.36.29 |05:30:39| |
|norm|9 | | |
0|
|SYNC_RFC |T899_U21640_M0 | | |smprd02.niladv.org |05:30:41| |
|norm|1 | | |
0|

Found 899 logons with 899 sessions


Total ES (gross) memory of all sessions: 67 MB
Most ES (gross) memory allocated by T47_U9774_M0: 8 MB

Force ABAP stack dump of session T12_U20220_M0


Force ABAP stack dump of session T23_U20175_M0
Force ABAP stack dump of session T38_U20216_M0
Force ABAP stack dump of session T61_U20113_M0

Sun Sep 22 05:30:43:910 2019


Force ABAP stack dump of session T88_U20160_M0
Force ABAP stack dump of session T121_U20226_M0
Force ABAP stack dump of session T190_U20350_M0
Force ABAP stack dump of session T197_U20347_M0
Force ABAP stack dump of session T200_U20354_M0
Force ABAP stack dump of session T251_U20559_M0
Force ABAP stack dump of session T253_U20470_M0
Force ABAP stack dump of session T255_U20475_M0
Force ABAP stack dump of session T260_U20481_M0
Force ABAP stack dump of session T288_U20578_M0
Force ABAP stack dump of session T313_U20596_M0
Force ABAP stack dump of session T315_U20602_M0
Force ABAP stack dump of session T323_U20621_M0
Force ABAP stack dump of session T342_U20674_M0
Force ABAP stack dump of session T364_U20713_M0
Force ABAP stack dump of session T404_U20779_M0
Force ABAP stack dump of session T426_U20813_M0
Force ABAP stack dump of session T458_U20872_M0
Force ABAP stack dump of session T461_U20878_M0
Force ABAP stack dump of session T497_U20937_M0
Force ABAP stack dump of session T513_U20966_M0
Force ABAP stack dump of session T558_U21031_M0
Force ABAP stack dump of session T560_U21036_M0
Force ABAP stack dump of session T616_U21135_M0
Force ABAP stack dump of session T639_U21178_M0
Force ABAP stack dump of session T643_U21185_M0
Force ABAP stack dump of session T709_U21296_M0
Force ABAP stack dump of session T733_U21340_M0
Force ABAP stack dump of session T739_U21349_M0
Force ABAP stack dump of session T755_U21443_M0
Force ABAP stack dump of session T794_U21448_M0
Force ABAP stack dump of session T871_U21595_M0

RFC-Connection Table (335 entries) Sun Sep 22 05:30:43 2019


------------------------------------------------------------

|No |Conv-Id |Fi-Key |Sess-Key |State |Type |Act. req |WP |Time |
|----|--------|--------------------------------|----------------|----------------|------|----------|---|--------------------------------|
| 0|52566963|52566963SU19995_M0 |T74_U19995_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 1|52446999|52446999CU19917_M0 |T57_U19917_M0_I0|ALLOCATED |
CLIENT|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 2|54277723|54277723SU21134_M0 |T615_U21134_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 3|52445981|52445981SU19917_M0 |T57_U19917_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 7|Sun Sep 22 04:29:56 2019 |
| 4|53822060|53822060SU20926_M0 |T492_U20926_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 5|53882124|53882124SU20949_M0 |T504_U20949_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 6|54577699|54577699SU21273_M0 |T694_U21273_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 7|54205118|54205118SU21101_M0 |T593_U21101_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 8|53402068|53402068SU20338_M0 |T192_U20338_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 9|53338998|53338998SU20320_M0 |T186_U20320_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 10|54372533|54372533SU21171_M0 |T636_U21171_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 11|53813726|53813726SU20922_M0 |T489_U20922_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 12|53370125|53370125SU20716_M0 |T352_U20716_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 13|53891128|53891128SU20952_M0 |T507_U20952_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 14|54748042|54748042SU21347_M0 |T737_U21347_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 15|53005186|53005186SU20180_M0 |T45_U20180_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 16|53750985|53750985SU20896_M0 |T473_U20896_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 17|53566302|53566302SU20410_M0 |T225_U20410_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 18|54702717|54702717SU21323_M0 |T721_U21323_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 19|54366552|54366552SU21169_M0 |T634_U21169_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 20|53819053|53819053SU20924_M0 |T491_U20924_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 21|53677905|53677905SU20861_M0 |T451_U20861_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 22|53374043|53374043SU20329_M0 |T112_U20329_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 23|53812054|53812054SU20921_M0 |T488_U20921_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 24|54369536|54369536SU21170_M0 |T635_U21170_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 25|53941255|53941255SU20976_M0 |T520_U20976_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 26|54305161|54305161SU20692_M0 |T334_U20692_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 27|54586777|54586777SU21276_M0 |T697_U21276_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 28|53871130|53871130SU20946_M0 |T502_U20946_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 29|54554387|54554387SU21260_M0 |T685_U21260_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 30|54448632|54448632SU21205_M0 |T658_U21205_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 31|53533790|53533790SU20795_M0 |T415_U20795_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 32|53831465|53831465SU20509_M0 |T274_U20509_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 33|54303059|54303059SU20690_M0 |T341_U20690_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 34|53477705|53477705SU20764_M0 |T396_U20764_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 35|53807465|53807465SU20918_M0 |T486_U20918_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 36|52899468|52899468SU20140_M0 |T128_U20140_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 37|53606863|53606863SU20824_M0 |T434_U20824_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 38|53461465|53461465SU20758_M0 |T391_U20758_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 39|54714894|54714894SU21327_M0 |T724_U21327_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 40|34093756|34093756CU18046_M0 |T72_U18046_M0_I0|ALLOCATED |
CLIENT|NO_REQUEST| 5|Sat Sep 21 23:27:18 2019 |
| 41|54397087|54397087SU21180_M0 |T641_U21180_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 42|53536792|53536792SU20796_M0 |T416_U20796_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 43|53659094|53659094SU20854_M0 |T440_U20854_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 44|54184460|54184460SU20641_M0 |T331_U20641_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 45|53745004|53745004SU20892_M0 |T471_U20892_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 46|52943122|52943122SU20158_M0 |T136_U20158_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 47|53498971|53498971SU20387_M0 |T223_U20387_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 48|54172368|54172368SU21089_M0 |T588_U21089_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 49|52802166|52802166SU20107_M0 |T104_U20107_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 50|53747978|53747978SU20893_M0 |T472_U20893_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 51|54517702|54517702SU21245_M0 |T678_U21245_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 52|52446999|52446999SU19918_M0 |T69_U19918_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:56 2019 |
| 53|53816065|53816065SU20923_M0 |T490_U20923_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 54|50062580|50062580CU9773_M0 |T10_U9773_M0_I0 |ALLOCATED |
CLIENT|NO_REQUEST| 2|Sat Sep 21 00:06:16 2019 |
| 55|54739047|54739047SU21342_M0 |T735_U21342_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 56|53666302|53666302SU20856_M0 |T439_U20856_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 57|53869968|53869968SU20533_M0 |T290_U20533_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 58|54720907|54720907SU21330_M0 |T726_U21330_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 59|54653834|54653834SU21308_M0 |T716_U21308_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 60|53799525|53799525SU20916_M0 |T484_U20916_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 61|54583761|54583761SU21275_M0 |T696_U21275_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 62|54199046|54199046SU21096_M0 |T534_U21096_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 63|54717900|54717900SU21328_M0 |T725_U21328_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 64|54375579|54375579SU21172_M0 |T637_U21172_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 65|54778897|54778897SU21364_M0 |T749_U21364_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 66|54580753|54580753SU21274_M0 |T695_U21274_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 67|54240969|54240969SU20669_M0 |T349_U20669_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 68|53474712|53474712SU20763_M0 |T395_U20763_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 69|53609831|53609831SU20825_M0 |T435_U20825_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 70|54213601|54213601SU21103_M0 |T549_U21103_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 71|53603861|53603861SU20823_M0 |T433_U20823_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 72|53504970|53504970SU20777_M0 |T402_U20777_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 73|53627349|53627349SU20435_M0 |T238_U20435_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 74|53261602|53261602SU20285_M0 |T166_U20285_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 75|54293525|54293525SU21142_M0 |T620_U21142_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 76|53740983|53740983SU20890_M0 |T469_U20890_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 77|54775980|54775980SU21363_M0 |T748_U21363_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 78|52630037|52630037SU20015_M0 |T3_U20015_M0_I0 |ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 79|54378556|54378556SU21173_M0 |T638_U21173_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 80|54405977|54405977SU21183_M0 |T601_U21183_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 81|53539769|53539769SU20797_M0 |T417_U20797_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 82|53542786|53542786SU20798_M0 |T418_U20798_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 83|54031268|54031268SU21015_M0 |T546_U21015_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 84|54055969|54055969SU20600_M0 |T277_U20600_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 85|53753961|53753961SU20897_M0 |T474_U20897_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 86|54508676|54508676SU21242_M0 |T665_U21242_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 87|54290760|54290760SU21141_M0 |T619_U21141_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 88|34093756|34093756SU18048_M0 |T147_U18048_M0_I|ALLOCATED |
SERVER|SAP_SEND | 5|Sat Sep 21 23:27:18 2019 |
| 89|54274191|54274191SU21133_M0 |T614_U21133_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 90|53335572|53335572SU20704_M0 |T359_U20704_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 91|54574766|54574766SU21272_M0 |T693_U21272_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 92|53952146|53952146SU20984_M0 |T527_U20984_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 93|53933395|53933395SU20967_M0 |T514_U20967_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 94|52695594|52695594SU20050_M0 |T122_U20050_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 95|53964177|53964177SU20988_M0 |T531_U20988_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 96|53683884|53683884SU20864_M0 |T453_U20864_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 97|54679972|54679972SU21315_M0 |T717_U21315_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 98|32252616|32252616SU20586_M0 |T308_U20586_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 99|53683971|53683971SU20456_M0 |T248_U20456_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 100|53884618|53884618SU20542_M0 |T279_U20542_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 101|53875871|53875871SU20535_M0 |T281_U20535_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 102|54568649|54568649SU21270_M0 |T692_U21270_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 103|53955198|53955198SU20985_M0 |T528_U20985_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 104|53961202|53961202SU20987_M0 |T530_U20987_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 105|54781967|54781967SU21365_M0 |T750_U21365_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 106|53713330|53713330SU20876_M0 |T384_U20876_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 107|54311482|54311482SU21148_M0 |T625_U21148_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 108|54009269|54009269SU21007_M0 |T539_U21007_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 109|53791976|53791976SU20908_M0 |T479_U20908_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 110|53252712|53252712SU20281_M0 |T163_U20281_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 111|53000034|53000034SU20176_M0 |T141_U20176_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 112|54241442|54241442SU21118_M0 |T606_U21118_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 113|53825033|53825033SU20927_M0 |T493_U20927_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 114|52752965|52752965SU20069_M0 |T86_U20069_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 115|53872183|53872183SU20947_M0 |T503_U20947_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 116|54544123|54544123SU21254_M0 |T411_U21254_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 117|36066368|36066368SU25456_M0 |T30_U25456_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 00:00:04 2019 |
| 118|53918034|53918034SU20961_M0 |T438_U20961_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 119|54302460|54302460SU21145_M0 |T622_U21145_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 120|54711822|54711822SU21326_M0 |T723_U21326_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 121|53450513|53450513SU20752_M0 |T388_U20752_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 122|54618765|54618765SU21293_M0 |T706_U21293_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 123|54511679|54511679SU21243_M0 |T676_U21243_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 124|54169405|54169405SU21088_M0 |T587_U21088_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 125|54308505|54308505SU21147_M0 |T624_U21147_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 126|54246530|54246530SU20673_M0 |T339_U20673_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 127|52864230|52864230SU20125_M0 |T96_U20125_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 128|53426195|53426195SU20346_M0 |T196_U20346_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 129|54268124|54268124SU21129_M0 |T557_U21129_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 130|53730608|53730608SU20888_M0 |T468_U20888_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 131|53863051|53863051SU20941_M0 |T500_U20941_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 132|54505692|54505692SU21241_M0 |T664_U21241_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 133|54022221|54022221SU21012_M0 |T543_U21012_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 134|54358834|54358834SU21166_M0 |T632_U21166_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 135|53772320|53772320SU20903_M0 |T478_U20903_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 136|54013979|54013979SU21009_M0 |T541_U21009_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 137|54345802|54345802SU21160_M0 |T629_U21160_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 138|53820547|53820547SU20507_M0 |T273_U20507_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 139|54708905|54708905SU21325_M0 |T722_U21325_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 140|54235438|54235438SU21115_M0 |T604_U21115_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 141|54644747|54644747SU21304_M0 |T713_U21304_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 142|53888093|53888093SU20951_M0 |T506_U20951_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 143|53958159|53958159SU20986_M0 |T529_U20986_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 144|51309220|51309220SU18279_M0 |T126_U18279_M0_I|ALLOCATED |
SERVER|NO_REQUEST| 0|Sun Sep 22 04:27:16 2019 |
| 145|53615810|53615810SU20828_M0 |T437_U20828_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 146|54238481|54238481SU21117_M0 |T605_U21117_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 147|54436627|54436627SU21201_M0 |T654_U21201_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 148|51312264|51312264SU18282_M0 |T98_U18282_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 4|Sun Sep 22 04:29:08 2019 |
| 149|53338570|53338570SU20705_M0 |T360_U20705_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 150|53172244|53172244SU20253_M0 |T146_U20253_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 151|53110172|53110172SU20215_M0 |T51_U20215_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 152|54221690|54221690SU21110_M0 |T600_U21110_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 153|53674932|53674932SU20860_M0 |T450_U20860_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 154|54028234|54028234SU21014_M0 |T545_U21014_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 155|38084880|38084880SU20248_M0 |T103_U20248_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 156|53326747|53326747SU20315_M0 |T183_U20315_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 157|54093287|54093287SU21054_M0 |T565_U21054_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 158|53573044|53573044SU20809_M0 |T293_U20809_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 159|54514686|54514686SU21244_M0 |T677_U21244_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 160|53405636|53405636SU20734_M0 |T378_U20734_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 161|54650835|54650835SU21306_M0 |T715_U21306_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 162|54160337|54160337SU21084_M0 |T584_U21084_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 163|54083666|54083666SU21050_M0 |T562_U21050_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 164|54229412|54229412SU21113_M0 |T602_U21113_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 165|53591025|53591025SU20818_M0 |T430_U20818_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 166|53940202|53940202SU20975_M0 |T519_U20975_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 167|54430592|54430592SU21199_M0 |T653_U21199_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 168|53122966|53122966SU20219_M0 |T36_U20219_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 169|53671936|53671936SU20859_M0 |T449_U20859_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 170|53796298|53796298SU20500_M0 |T271_U20500_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 171|53575295|53575295SU20810_M0 |T424_U20810_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 172|54989183|54989183SU21461_M0 |T804_U21461_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 173|54500465|54500465SU21238_M0 |T666_U21238_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 174|54015616|54015616SU21010_M0 |T542_U21010_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 175|53402644|53402644SU20733_M0 |T377_U20733_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 176|54333571|54333571SU20703_M0 |T358_U20703_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 177|54330571|54330571SU20702_M0 |T357_U20702_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 178|53359275|53359275SU20714_M0 |T365_U20714_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 179|54647828|54647828SU21305_M0 |T714_U21305_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 180|53426328|53426328SU20745_M0 |T385_U20745_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 181|54427906|54427906SU21198_M0 |T652_U21198_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 182|54081559|54081559SU21049_M0 |T550_U21049_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 183|54166366|54166366SU21086_M0 |T586_U21086_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 184|54025224|54025224SU21013_M0 |T544_U21013_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 185|54150626|54150626SU21080_M0 |T581_U21080_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 186|54128970|54128970SU21069_M0 |T525_U21069_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 187|53341583|53341583SU20707_M0 |T361_U20707_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 188|54828522|54828522SU21379_M0 |T756_U21379_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 189|54967717|54967717SU21453_M0 |T798_U21453_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 190|53894111|53894111SU20953_M0 |T508_U20953_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 191|54485316|54485316SU21231_M0 |T669_U21231_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 192|54232387|54232387SU21114_M0 |T603_U21114_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 193|53933040|53933040SU20558_M0 |T75_U20558_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 194|53944437|53944437SU20977_M0 |T521_U20977_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 195|53702262|53702262SU20874_M0 |T460_U20874_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 196|54090292|54090292SU21053_M0 |T564_U21053_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 197|54550695|54550695SU21259_M0 |T684_U21259_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 198|53198536|53198536SU20262_M0 |T33_U20262_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 199|53643123|53643123SU20844_M0 |T428_U20844_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 200|53612837|53612837SU20827_M0 |T436_U20827_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 201|53399648|53399648SU20732_M0 |T376_U20732_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 202|54305468|54305468SU21146_M0 |T623_U21146_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 203|52937961|52937961SU20155_M0 |T95_U20155_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 204|53438225|53438225SU20748_M0 |T256_U20748_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 205|53746045|53746045SU20474_M0 |T254_U20474_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 206|54442606|54442606SU21203_M0 |T656_U21203_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 207|54096290|54096290SU21055_M0 |T566_U21055_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 208|54034234|54034234SU21016_M0 |T547_U21016_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 209|53580137|53580137SU20812_M0 |T425_U20812_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 210|52591484|52591484SU20002_M0 |T142_U20002_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 211|55021044|55021044SU21475_M0 |T811_U21475_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 212|53519586|53519586SU20784_M0 |T408_U20784_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 213|53545747|53545747SU20799_M0 |T419_U20799_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 214|54476047|54476047SU21228_M0 |T674_U21228_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 215|53389820|53389820SU20333_M0 |T131_U20333_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 216|54141465|54141465SU20630_M0 |T328_U20630_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 217|53309969|53309969SU20303_M0 |T167_U20303_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 218|54925090|54925090SU21437_M0 |T789_U21437_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 219|52634522|52634522SU20019_M0 |T24_U20019_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 220|53885142|53885142SU20950_M0 |T505_U20950_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 221|54118046|54118046SU20620_M0 |T322_U20620_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 222|54890268|54890268SU21419_M0 |T772_U21419_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 223|54641839|54641839SU21303_M0 |T712_U21303_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 224|54919111|54919111SU21433_M0 |T787_U21433_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 225|53848955|53848955SU20936_M0 |T445_U20936_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 226|54622455|54622455SU21294_M0 |T707_U21294_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 227|54012478|54012478SU21008_M0 |T540_U21008_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 228|51487902|51487902SU19196_M0 |T87_U19196_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| 1|Sun Sep 22 04:27:05 2019 |
| 229|53487258|53487258SU20382_M0 |T220_U20382_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 230|54815117|54815117SU21374_M0 |T563_U21374_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 231|54439604|54439604SU21202_M0 |T655_U21202_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 232|54299485|54299485SU21144_M0 |T621_U21144_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 233|54858049|54858049SU21394_M0 |T766_U21394_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 234|54107318|54107318SU20615_M0 |T320_U20615_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 235|54099331|54099331SU21056_M0 |T567_U21056_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 236|53598228|53598228SU20821_M0 |T432_U20821_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 237|54950971|54950971SU21442_M0 |T572_U21442_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 238|53988107|53988107SU20997_M0 |T522_U20997_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 239|53468725|53468725SU20761_M0 |T393_U20761_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 240|54983185|54983185SU21459_M0 |T802_U21459_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 241|53315778|53315778SU20308_M0 |T178_U20308_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 242|53742654|53742654SU20891_M0 |T470_U20891_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 243|54913113|54913113SU21431_M0 |T785_U21431_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 244|54142537|54142537SU21075_M0 |T578_U21075_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 245|54855048|54855048SU21392_M0 |T765_U21392_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 246|50062580|50062580SU9774_M0 |T47_U9774_M0_I0 |ALLOCATED |
SERVER|NO_REQUEST| 0|Sat Sep 21 00:06:16 2019 |
| 247|54958316|54958316SU21447_M0 |T793_U21447_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 248|54414846|54414846SU21187_M0 |T644_U21187_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 249|53209464|53209464SU20267_M0 |T130_U20267_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 250|53800681|53800681SU20917_M0 |T485_U20917_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 251|54155465|54155465SU21082_M0 |T583_U21082_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 252|53938941|53938941SU20562_M0 |T298_U20562_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 253|53408645|53408645SU20736_M0 |T379_U20736_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 254|53680911|53680911SU20863_M0 |T452_U20863_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 255|53734228|53734228SU20469_M0 |T252_U20469_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 256|54445653|54445653SU21204_M0 |T657_U21204_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 257|54163344|54163344SU21085_M0 |T585_U21085_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 258|54848964|54848964SU21389_M0 |T763_U21389_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 259|53465735|53465735SU20760_M0 |T392_U20760_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 260|54916038|54916038SU21432_M0 |T786_U21432_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 261|54467123|54467123SU21224_M0 |T673_U21224_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 262|55036790|55036790SU21480_M0 |T795_U21480_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 263|54045247|54045247SU20597_M0 |T314_U20597_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 264|54787981|54787981SU21368_M0 |T752_U21368_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 265|53471695|53471695SU20762_M0 |T394_U20762_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 266|54784975|54784975SU21366_M0 |T751_U21366_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 267|54153466|54153466SU21081_M0 |T582_U21081_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 268|53522465|53522465SU20394_M0 |T218_U20394_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 269|54806114|54806114SU21372_M0 |T753_U21372_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 270|53561044|53561044SU20407_M0 |T217_U20407_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 271|53396676|53396676SU20731_M0 |T375_U20731_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 272|00116617|00116617SU22844_M0 |T118_U22844_M0_I|ALLOCATED |
SERVER|NO_REQUEST| 7|Sun Sep 22 04:27:11 2019 |
| 273|54986113|54986113SU21460_M0 |T803_U21460_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 274|54102300|54102300SU21057_M0 |T568_U21057_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 275|53185038|53185038SU20258_M0 |T155_U20258_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 276|54846049|54846049SU21388_M0 |T762_U21388_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 277|54062174|54062174SU21035_M0 |T559_U21035_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 278|54838787|54838787SU21385_M0 |T760_U21385_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 279|54981013|54981013SU21458_M0 |T801_U21458_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 280|54922085|54922085SU21435_M0 |T788_U21435_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 281|54975858|54975858SU21456_M0 |T800_U21456_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 282|53511057|53511057SU20782_M0 |T406_U20782_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 283|54840465|54840465SU21386_M0 |T761_U21386_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 284|53729452|53729452SU20887_M0 |T467_U20887_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 285|52815034|52815034SU20112_M0 |T139_U20112_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 286|54897592|54897592SU21424_M0 |T768_U21424_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 287|52580484|52580484SU19999_M0 |T97_U19999_M0_I0|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 288|54852040|54852040SU21390_M0 |T764_U21390_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 289|54992105|54992105SU21464_M0 |T805_U21464_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 290|54995162|54995162SU21466_M0 |T806_U21466_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 291|55049087|55049087SU21486_M0 |T817_U21486_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 292|55051249|55051249SU21487_M0 |T818_U21487_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 293|55054180|55054180SU21488_M0 |T819_U21488_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 294|55057257|55057257SU21489_M0 |T820_U21489_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 295|55060158|55060158SU21491_M0 |T821_U21491_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 296|55063228|55063228SU21493_M0 |T815_U21493_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 297|55081051|55081051SU21498_M0 |T773_U21498_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 298|55090083|55090083SU21501_M0 |T825_U21501_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 299|55112927|55112927SU21511_M0 |T831_U21511_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 300|55119309|55119309SU21513_M0 |T832_U21513_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 301|55122261|55122261SU21514_M0 |T833_U21514_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 302|55125318|55125318SU21515_M0 |T834_U21515_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 303|55128202|55128202SU21516_M0 |T835_U21516_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 304|55131300|55131300SU21520_M0 |T837_U21520_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 305|55149095|55149095SU21523_M0 |T838_U21523_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 306|55180465|55180465SU21536_M0 |T844_U21536_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 307|55186350|55186350SU21539_M0 |T846_U21539_M0_I|ALLOCATED |
SERVER|NO_REQUEST| -1|Thu Jan 1 01:00:00 1970 |
| 308|55189331|55189331SU21540_M0 |T847_U21540_M0_I|ALLOCATED |
SERVER