Channel: Symantec Connect - Backup and Recovery - Discussions

Millions of Files - Slow Performance

I need a solution

Hi there,

 

We are currently running NetBackup 7.6.0.1 with one combined media/master server (only 30 clients for now).

File servers with 1+ million files take forever to back up, with a throughput of only 5-9 MB/sec. The backups go directly to basic disk, and even running just one job still shows slow throughput. I have a client that has been backing up since yesterday and has only gone through 3 million files; the file server holds 6 million. Other backups (such as Oracle and smaller servers) appear fine and can achieve a throughput of 20+ MB/sec.

I can log into the client and copy a 500 MB file directly to the NetBackup disk staging unit (E:\), and the throughput is 25 MB/sec. My question is: why is throughput so slow when backing up large file servers, and is there any way to increase it? The file server has an OS drive (C:\) and a data drive (D:\). Multistreaming doesn't seem to improve performance in this case, so it's turned off.
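For what it's worth, the gap between the 25 MB/sec single-file copy and the 5-9 MB/sec backup is consistent with per-file overhead (open/close, metadata, catalog entry) dominating when the files are small. A rough back-of-envelope model, with purely assumed numbers (an average file size around 100 KB and roughly 10 ms of overhead per file, neither of which comes from the post), looks like this:

```python
# Back-of-envelope model: effective backup throughput when a fixed
# per-file overhead is added to the raw transfer time of each file.
# All inputs are illustrative assumptions, not measured values.

def effective_throughput(avg_file_mb, per_file_overhead_s, raw_mb_per_s):
    """MB/s actually achieved once per-file overhead is included."""
    transfer_s = avg_file_mb / raw_mb_per_s
    return avg_file_mb / (per_file_overhead_s + transfer_s)

# Assume ~100 KB average file, 10 ms overhead per file, 25 MB/s raw speed.
print(round(effective_throughput(0.1, 0.010, 25.0), 1))  # ~7.1 MB/s
```

Under those assumptions the model lands right in the observed 5-9 MB/sec range, which is why image-level approaches (such as a snapshot-based backup) tend to help far more than buffer tuning for this workload.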

My settings for the two configuration touch files are as follows:

NUMBER_DATA_BUFFERS_DISK = 64

SIZE_DATA_BUFFERS_DISK = 1048576
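For reference, these are plain touch files on the media server containing a single number each. A minimal sketch, assuming a default Windows install path (adjust to your actual install location):

```shell
cd "C:\Program Files\Veritas\NetBackup\db\config"
echo 64 > NUMBER_DATA_BUFFERS_DISK
echo 1048576 > SIZE_DATA_BUFFERS_DISK
```

Note that buffer tuning mainly helps streaming throughput on large files; with millions of small files, the per-file overhead usually dominates regardless of buffer settings.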

I'm considering using NetBackup Snapshot Client, but I need to research it and determine whether a license is required.

 

Any advice or assistance is appreciated.

 

EDIT: I forgot to mention that differential backups are fine and don't take long, since I'm using the change journal. This issue pertains only to the weekly full backups.

 

