Slow application (Linux server, Oracle DB)
Slow application [message #683754] Thu, 18 February 2021 02:00
bgraupe
Messages: 25
Registered: July 2009
Location: Windhoek Namibia
Junior Member
Good day,

I have a server/database environment which is extremely slow in performance.

I have done some optimization of the sqlldr loading process. I run the process on a Red Hat Linux server, converting the raw files and then inserting the converted output files into an Oracle relational database with the sqlldr utility. The entire process is extremely slow and I need to identify where I can improve; somewhere in the ecosystem there must be a bottleneck causing the poor overall performance. At the current loading speed I cannot load the required number of files per day. On average I need to load 21,000 files per day. File sizes vary between 1.5 MB and 5 MB per file, and most are around 1.5 MB.

I have already tuned the sqlldr process based on feedback received in another post on this forum, so I believe that part is reasonably optimized. I am using the conventional path load (I cannot use direct path). Each raw file on the server is processed into one parent file and three child files, which are inserted into one parent table and three child tables in the Oracle DB.

Current sqlldr parameter values:
sqlldr userid=xxxxx/xxxxx@xxxxx control=fgcharge.dat log=$logfile discard=$CUR_DIR$oldfilename.discard bad=$CUR_DIR$oldfilename.bad silent=feedback errors=1000000 bindsize=20000000 readsize=20000000 rows=100000

The raw files arrive as HEX code (telecommunication ASN.1 files). My scripts decode the records within each file from HEX to ASCII (this takes milliseconds and is not an issue) and write the output files (one parent and three child files). The sqlldr command is then used to insert the converted records (parent and three children) into the DB. After a successful insert, I zip the original raw ASN.1 file again and move it to a separate directory, so I know that ASN.1 file was successfully inserted and dealt with.
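To make the flow concrete, here is a simplified sketch of my per-file loop. The helper name decode_asn1.sh, the directory layout, and the .asn1 extension are illustrative placeholders, not my real script; the sqlldr parameters are the ones listed above. With DRY_RUN=1 it only echoes the commands instead of running them.

```shell
# Sketch of the per-file pipeline: decode -> load -> archive.
# decode_asn1.sh, ./incoming, ./loaded and *.asn1 are placeholders.
: "${IN_DIR:=./incoming}"
: "${DONE_DIR:=./loaded}"
: "${DRY_RUN:=1}"   # 1 = just print the commands; 0 = actually run them

run() {
    if [ "$DRY_RUN" = "1" ]; then echo "WOULD RUN: $*"; else "$@"; fi
}

process_all() {
    for raw in "$IN_DIR"/*.asn1; do
        [ -e "$raw" ] || continue
        base=$(basename "$raw" .asn1)
        # 1) decode HEX -> ASCII: writes one parent and three child data files
        run ./decode_asn1.sh "$raw"
        # 2) conventional-path load of the converted output
        run sqlldr userid=xxxxx/xxxxx@xxxxx control=fgcharge.dat \
            log="$base.log" silent=feedback errors=1000000 \
            bindsize=20000000 readsize=20000000 rows=100000
        # 3) only after a successful load: re-zip the raw file and archive it
        run gzip "$raw" && run mv "$raw.gz" "$DONE_DIR/"
    done
}

mkdir -p "$IN_DIR" "$DONE_DIR"
process_all
```

The important property is the ordering: a raw file is only zipped and moved to the archive directory after its load step finished, which is how I track what has been dealt with.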
The information in the DB is used for reporting purposes.

I manage to load about 14,000 files per day with the above sqlldr command, i.e. roughly 9-10 files per minute (14,000 ÷ 1,440 minutes). To handle the number of files per day that I need to process, I would have to load about 15 files per minute (21,000 ÷ 1,440 ≈ 14.6).
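For reference, the arithmetic behind those per-minute figures (plain shell, nothing Oracle-specific):

```shell
# Daily file counts from my post, converted to a per-minute rate.
files_needed=21000            # files I must load per day
files_now=14000               # files I actually load per day
min_per_day=$(( 24 * 60 ))    # 1440 minutes in a day
rate_now=$(( files_now / min_per_day ))                            # floor
rate_needed=$(( (files_needed + min_per_day - 1) / min_per_day ))  # ceil
echo "current rate: $rate_now files/min, required: $rate_needed files/min"
# prints: current rate: 9 files/min, required: 15 files/min
```

So the pipeline needs to be roughly 50% faster end to end to keep up with the daily volume.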

There must be a bottleneck somewhere in the process of getting the files onto the server, converting them from HEX to ASCII (very fast, milliseconds) and using sqlldr to insert the records into the Oracle database. I am not a DBA or network specialist, so I requested all relevant information from my teams. Please see the information in the attachments.

Any advice will be appreciated.

I will upload three separate documents, due to the limit on uploads per reply:
A basic document with the ecosystem configurations.
A storage stats document
A network stats document

Kind regards
Bonita