Currently I have one Linux server that hosts backups for about 300 clients; the total storage is about 1 petabyte.
I want to share some experiences with the Bacula backup system in a large deployment environment.
1. Use PostgreSQL instead of MySQL.
- The database size is smaller.
- The index and reindex tasks finish in less than an hour; in MySQL they took days, even weeks.
- Exporting and importing the database takes less than an hour, and the exported dump of a 20 GB database is about 8 GB on disk.
- It is easier to administer, with tools like pgAdmin III.
- The Bacula dbcheck command runs on a 20 GB database in less than an hour.
- PostgreSQL has a nice feature called autovacuum.
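Autovacuum is enabled by default in modern PostgreSQL releases, so normally there is nothing to install; the related knobs live in postgresql.conf. A minimal sketch of the relevant settings (the values shown are the defaults or illustrative, not tuned for any particular workload):

```
# postgresql.conf -- autovacuum-related settings (illustrative values)
autovacuum = on                        # on by default since PostgreSQL 8.3
autovacuum_max_workers = 3             # concurrent autovacuum worker processes
autovacuum_naptime = 1min              # how often each database is checked
autovacuum_vacuum_scale_factor = 0.2   # vacuum a table after ~20% of it changes
```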
2. Tune PostgreSQL to your system.
- You can use pgtune – http://pgfoundry.org/projects/pgtune/
- Or you can tune it yourself. I tuned one of the servers myself, and I can tell you that I resolved the “Insert Attributes problem”, among others. Below is an example of my tuned postgresql.conf for a server with 16 GB of RAM.
shared_buffers = 3840MB
effective_cache_size = 4GB
checkpoint_segments = 32
checkpoint_completion_target = 0.9
work_mem = 96MB
maintenance_work_mem = 960MB
wal_buffers = 8MB
max_connections = 30
For an explanation of these settings, see the PostgreSQL wiki: http://wiki.postgresql.org/wiki/Tuning_Your_PostgreSQL_Server
Please note that the settings above are for a server with 16 GB of RAM; they may change from server to server.
3. Create these indexes on your database:
- File.PathId
- File.FilenameId
- Job.FileSetId
- Job.ClientId
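In PostgreSQL these can be created from psql against the bacula database; the index names below are my own choice, not anything Bacula requires:

```sql
CREATE INDEX file_pathid_idx     ON File (PathId);
CREATE INDEX file_filenameid_idx ON File (FilenameId);
CREATE INDEX job_filesetid_idx   ON Job (FileSetId);
CREATE INDEX job_clientid_idx    ON Job (ClientId);
```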
4. Set “Heartbeat Interval = 1 minute” everywhere: in bacula-sd.conf, bacula-dir.conf, bacula-fd.conf, and in the clients’ bacula-fd.conf.
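As a sketch, the directive goes into the main daemon resource of each configuration file (the resource names below are placeholders in the style of the standard Bacula sample configs; keep your existing ones):

```
# bacula-fd.conf (on the server and on every client)
FileDaemon {
  Name = myclient-fd
  ...
  Heartbeat Interval = 1 minute
}

# bacula-sd.conf
Storage {
  Name = mystorage-sd
  ...
  Heartbeat Interval = 1 minute
}

# bacula-dir.conf
Director {
  Name = mydir-dir
  ...
  Heartbeat Interval = 1 minute
}
```

This keeps idle control connections alive, which helps when firewalls or NAT devices between the Director, Storage Daemon, and clients drop long-lived idle TCP sessions.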
5. Keep the Bacula clients at the same version as the Bacula Director.
If you have more tips, please share.