" /> LogDB Database size - Growing, Growing, Growing - Genesys CTI User Forum

Author Topic: LogDB Database size - Growing, Growing, Growing  (Read 7747 times)

Offline pspenning

  • Jr. Member
  • **
  • Posts: 99
  • Karma: 0
    • West Interactive
LogDB Database size - Growing, Growing, Growing
« on: January 24, 2008, 07:33:59 PM »
Good Day Everyone,
I am experiencing a problem with my LogDB database: it just won't stop growing.  Can I set a size limit / purging on this database within Genesys, or must I set this within SQL Server?

Thanks,
Perry

Offline René

  • Administrator
  • Hero Member
  • *****
  • Posts: 1832
  • Karma: 62
Re: LogDB Database size - Growing, Growing, Growing
« Reply #1 on: January 24, 2008, 10:00:49 PM »
Hi,

Genesys doesn't provide any purging tool for LogDB, so it's necessary to do it at the SQL Server level. Below are my SQL scripts for purging LogDB.

[code]-- Purge non-audit messages older than 30 days
DELETE FROM g_log_messages
WHERE priority IN (2,3,4) AND category = 0 AND timewritten < GETDATE()-30

-- Purge audit messages older than 90 days
DELETE FROM g_log_messages
WHERE priority IN (4,5) AND category IN (1,2) AND timewritten < GETDATE()-90

-- Remove attributes whose parent message was deleted above
DELETE FROM g_log_attrs
WHERE lrid NOT IN (SELECT id FROM g_log_messages)
[/code]

The first query deletes all non-audit messages older than 30 days. The second one deletes all audit messages older than 90 days. The last one deletes the attributes related to the previously deleted messages.

R.

Offline eugene

  • Jr. Member
  • **
  • Posts: 85
  • Karma: 2
Re: LogDB Database size - Growing, Growing, Growing
« Reply #2 on: January 25, 2008, 06:38:41 AM »
Rene, I believe you can also purge the Log DB from SCI's log wizard.  For guys like me who only know how to SELECT * on everything, the wizard is quite handy.

Eugene

Kevin

  • Guest
Re: LogDB Database size - Growing, Growing, Growing
« Reply #3 on: January 25, 2008, 06:11:36 PM »
We've done something similar to Rene's suggestion, and have it in a script that runs once a week.

One caution: you may want to modify the SQL to delete the rows in chunks - maybe 1000 - 2000 at a time - so you don't run into issues with locking the table for an excessive amount of time, filling the transaction log too quickly, etc.
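
If it helps, here is a minimal sketch of such a chunked delete in T-SQL, reusing the 30-day rule from Rene's first query. The 2000-row batch size is just an example, and DELETE TOP needs SQL Server 2005 or later (on older versions SET ROWCOUNT does the same job):

[code]-- Delete in small batches until no old rows remain, so each transaction
-- stays short and the table isn't locked for long.
DECLARE @rows INT
SET @rows = 1
WHILE @rows > 0
BEGIN
    DELETE TOP (2000) FROM g_log_messages
    WHERE priority IN (2,3,4) AND category = 0 AND timewritten < GETDATE()-30
    SET @rows = @@ROWCOUNT
END
[/code]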

Though SCI does have the maintenance tool, (to my knowledge) it just does a straight delete, without any consideration for the impact of how many rows are getting deleted, or what the status is.

Offline zubr83

  • Newbie
  • *
  • Posts: 25
  • Karma: 0
Re: LogDB Database size - Growing, Growing, Growing
« Reply #4 on: July 09, 2008, 09:20:04 AM »
Hi all

I have the same problem... My LogDB has grown to 40 GB and there is very little free space left on the server.
Because of the lack of free space I can't even run a SELECT on this DB in SQL Server to see what data is in there...

Will these SQL scripts help me?
And if they do, how long will the purge take on a database of this size?

Kevin

  • Guest
Re: LogDB Database size - Growing, Growing, Growing
« Reply #5 on: July 09, 2008, 12:37:58 PM »
If your database is that large and you have that little space, you may have limited options:
1 - Work with your DBA to create a query that will purge the logs in small chunks (about 2000 rows at a time). If your DBMS allows it, you may want to write the query so the changes are not sent to the transaction log, since the log itself takes space on the server. If you don't, you may run out of space on your server as the transaction log fills up.

2 - You may also try using the wizard to remove small chunks of data. For example, you might try deleting any rows that are more than 365 days old, then 300, then 240, etc. If you get more than 50000 rows, reduce the size of your query.

3 - If you don't care about the log records, or things have reached a critical state, you may want to consider just truncating the tables (see the sketch after this list). In many (if not all) DBMSs, this will delete all rows in the table without recording the changes in the transaction log. This is about your quickest solution; HOWEVER, THIS WILL REMOVE ALL LOG RECORDS WITHOUT THE OPTION TO RESTORE (except from a server backup).
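
For option 3, a rough T-SQL sketch (table names are taken from Rene's script above; I can't say whether LogDB defines foreign keys between these tables, and TRUNCATE is refused on a table referenced by one, so check with your DBA first):

[code]-- Check how much space each table is actually using before deciding
EXEC sp_spaceused 'g_log_messages'
EXEC sp_spaceused 'g_log_attrs'

-- Last resort: removes ALL log records with minimal logging and no row-level undo
TRUNCATE TABLE g_log_attrs
TRUNCATE TABLE g_log_messages
[/code]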

Regardless of the path you take, engage your DBA in the solution. They may be able to assist you in the creation of the scripts, and/or have suggestions on how to approach the problem both short-term and long-term.

Once you get the logs cleared, you may also want to evaluate what is written to the logDB. How many applications send Interaction and Trace level records to the network? Do you need Interaction and Trace level records sent to the network?
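
If you want a quick look at what is actually filling LogDB before changing any log options, something like the query below can help. It only uses the columns that appear in Rene's purge script (priority, category, timewritten), so treat the breakdown as a starting point rather than a definitive report:

[code]-- Message volume and date range per priority/category combination
SELECT priority, category, COUNT(*) AS msg_count,
       MIN(timewritten) AS oldest, MAX(timewritten) AS newest
FROM g_log_messages
GROUP BY priority, category
ORDER BY msg_count DESC
[/code]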