mattmm Tux's lil' helper
Joined: 27 Feb 2004 Posts: 79
Posted: Mon Jan 03, 2005 7:35 pm Post subject: Monitoring MySQL
What do you all recommend for monitoring MySQL? I use mytop, but I need something that will send out alerts or set up a GUI on a webpage so I can be alerted when MySQL crashes.
I just cut over about 1.3k additional users to our MySQL-backed mailing system and it's been crashing with a max connections error (that's a whole different issue).
adaptr Watchman
Joined: 06 Oct 2002 Posts: 6730 Location: Rotterdam, Netherlands
Posted: Mon Jan 03, 2005 8:43 pm Post subject:
Well, phpMyAdmin does its job pretty well, considering it continues to top the activity charts at SourceForge.
As to the max connections issue: have you tried using MaxDB instead?
There's info on the MySQL site. _________________ >>> emerge (3 of 7) mcse/70-293 to /
Essential tools: gentoolkit eix profuse screen
kashani Advocate
Joined: 02 Sep 2002 Posts: 2032 Location: San Francisco
Posted: Tue Jan 04, 2005 6:14 am Post subject:
Nagios works well for us, but it's a bit involved if this is all you need it for. OTOH, if you have no monitoring system at all, Nagios is relatively easy to get working and doing useful things.
MaxDB isn't going to help with more connections, since the problem is that you're running so many threads of Postfix/Courier or whatever, each opening a MySQL connection to query where to deliver stuff. There is no easy solution for this, but I'll toss a few ideas out there.
1. MySQL tuning. Assuming this DB only does email, you can play a number of tricks you wouldn't normally be able to, since your DB is tiny. Bring down your buffers, increase the server's RAM, and MySQL can handle 1000 connections.
2. Assuming you're going to try to handle the connections anyway: using Linux 2.6, NPTL, and recompiling glibc and MySQL will buy you better performance with that many threads.
3. Replicate MySQL locally to each mail server and have the mail servers talk directly to the socket instead of the TCP interface. Replication won't be hard, since your mail DB isn't going to change that often... say something on the order of 200 updates, deletes, and inserts a day. And because your DB is small, it's not a huge overhead for the mail server. _________________ Will personally fix your server in exchange for motorcycle related shop tools in good shape.
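A rough sketch of what point 3 could look like in my.cnf, using the same set-variable syntax as the 4.0-era configs in this thread; the hostname, replication user, and server-ids are made up for illustration:

```
# On the master (central mail DB):
[mysqld]
server-id = 1
log-bin

# On each mail server's local replica:
[mysqld]
server-id       = 2
master-host     = db-master.example.com
master-user     = repl
master-password = xxxx
# Postfix/Courier then query the local socket, not TCP:
socket          = /var/lib/mysql/mysql.sock
```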
tuxmin l33t
Joined: 24 Apr 2004 Posts: 838 Location: Heidelberg
Posted: Tue Jan 04, 2005 8:29 am Post subject:
If you want to check MySQL only, I'd suggest writing a short script that is triggered via cron every minute:
do a simple query and send a mail in case the connection fails.
Code: |
echo "show variables" | mysql -u user --password=xxxx || \
(echo "Failure at `date`") | mailx -s "mysql down" alert@domain
|
You would do something similar with Nagios anyway...
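The matching crontab entry to fire that script every minute might look like this (the script path is hypothetical):

```
# min hour dom mon dow  command
*   *    *   *   *      /usr/local/bin/check_mysql.sh >/dev/null 2>&1
```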
Hth, Alex!!! _________________ ALT-F4
mattmm Tux's lil' helper
Joined: 27 Feb 2004 Posts: 79
Posted: Tue Jan 04, 2005 3:47 pm Post subject:
Ok, I'll check into changing some of the buffers around and whatnot. Right now my my.cnf looks like this:
Code: |
[mysqld]
set-variable = max_allowed_packet=16M
set-variable = max_user_connections=1000
set-variable = max_connections=1000
set-variable = wait_timeout=60
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
[mysql.server]
user=mysql
basedir=/var/lib
[safe_mysqld]
err-log=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid
|
Even with max_connections set to 1000, it locks up with the max connections error. I have a feeling it has to do with the Perl code that the company custom-made for us. Problem is they are little to no help, and I know nothing about Perl, so I can't go in and look around.
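One way to see whether the 1000-connection ceiling is really being hit (rather than, say, the Perl code leaking connections) is to compare the server's peak usage against its limit. This is a sketch, not a finished monitor: near_limit is a made-up helper name, and the mysqladmin lines in the comment assume a local server with credentials already set up.

```shell
#!/bin/sh
# Sketch of a threshold check: true when peak usage reaches 90% of the limit.
near_limit() {
    # exit 0 (true) when $1 >= 90% of $2
    [ "$1" -ge $(( $2 * 9 / 10 )) ]
}

# In practice the two numbers would come from the server, e.g.:
#   used=$(mysqladmin extended-status | awk '/Max_used_connections/ {print $4}')
#   limit=$(mysqladmin variables | awk '/max_connections/ {print $4}')
used=950
limit=1000
if near_limit "$used" "$limit"; then
    echo "MySQL peaked at $used of $limit connections"   # pipe to mailx for an alert
fi
```

If the peak sits well below 1000 when the lockups happen, the problem is more likely connections that are opened and never closed than raw volume.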