Channel: Symantec Connect - Backup and Recovery - Discussion

shared memory for netbackup

I need a solution

Hi, I am running backup jobs with various settings. Somehow, I am not able to submit a job with more than 2 GB of shared memory buffer space.

I am using a disk-based storage unit (BasicDisk), with the number of concurrent jobs set to 24.

I have set SIZE_DATA_BUFFERS to 256 KB (i.e. 256*1024 = 262144).

I cannot set NUMBER_DATA_BUFFERS to more than about 8200 (just a rough estimate); it works fine at 8150.

My NetBackup server has 256 GB of RAM.

I've set the shared memory limit to 128 GB at the OS level.
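It may be worth double-checking that the 128 GB limit is actually what the kernel is enforcing at run time. A minimal sketch, assuming a Linux media server with the standard `/proc` paths (adjust for other platforms):

```shell
# Sketch: read the shared memory ceilings the kernel currently enforces.
shmmax=$(cat /proc/sys/kernel/shmmax)   # max size of ONE segment, in bytes
shmall=$(cat /proc/sys/kernel/shmall)   # max TOTAL shared memory, in pages
page=$(getconf PAGE_SIZE)               # page size, to convert shmall to bytes

echo "largest single segment: ${shmmax} bytes"
echo "system-wide total:      $((shmall * page)) bytes"
ipcs -lm                                # the same limits, as seen by the IPC subsystem
```

If `shmmax` reports less than the roughly 2.1 GB segment bptm is requesting, the EINVAL would be expected regardless of any NetBackup setting.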

 /usr/openv/netbackup/bin/bpbackup -p Gen_Test -s Gen_Test -w GEN_DATA GEN_KBSIZE=2000 GEN_MAXFILES=500 GEN_PERCENT_RANDOM=100
EXIT STATUS 89: problems encountered during setup of shared memory

That means it is not allowing me to use more than roughly 2 GB of shared memory per job. Here is the bptm log:

16:22:22.913 [7969] <2> io_set_recvbuf: setting receive network buffer to 262144 bytes
16:22:22.913 [7969] <2> read_legacy_touch_file: Found /usr/openv/netbackup/db/config/NUMBER_DATA_BUFFERS; requested from (tmcommon.c.3525).
16:22:22.913 [7969] <2> read_legacy_touch_file: 8200 read ; requested from (tmcommon.c.3525).
16:22:22.913 [7969] <2> io_init: using 8200 data buffers
16:22:22.913 [7969] <2> io_init: child delay = 10, parent delay = 15 (milliseconds)
16:22:22.914 [7969] <16> create_shared_memory: could not allocate enough shared memory for backup buffers, Invalid argument

So, is there a maximum cap on the amount of shared memory that can be used per NetBackup job?
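One thing that stands out: the boundary between your working and failing settings lines up with a signed 32-bit byte count (2^31). A quick sketch of the arithmetic (treating the pool size as simply buffer size times buffer count, and ignoring whatever per-segment header overhead bptm adds, which I can't confirm):

```shell
# bptm's buffer pool is roughly SIZE_DATA_BUFFERS * NUMBER_DATA_BUFFERS bytes
# (plus some bookkeeping overhead this sketch ignores).
size=262144                          # SIZE_DATA_BUFFERS: 256 KB
failing=$((size * 8200))             # the setting that returns status 89
working=$((size * 8150))             # the setting that works
limit=$((1 << 31))                   # 2 GiB: a signed 32-bit byte count

echo "8200 buffers: ${failing} bytes"   # 2149580800, just ABOVE 2^31
echo "8150 buffers: ${working} bytes"   # 2136473600, BELOW 2^31
echo "2^31:         ${limit} bytes"     # 2147483648
```

Since 8200 buffers asks for 2,149,580,800 bytes (just over 2^31 = 2,147,483,648) while 8150 stays under it, a 32-bit size field somewhere in the shmget() path would explain the "Invalid argument" even with shmmax set to 128 GB. If that guess is right, keeping SIZE_DATA_BUFFERS × NUMBER_DATA_BUFFERS below 2 GiB per stream (and raising concurrency instead) would be the workaround to test first.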

