One of my customers is a software company that uses Oracle Database for their product. One of the things we need to do when they certify an Oracle version is to create silent installation scripts. These scripts are for Windows and are used for demo and testing environments. I did that for 11.2 and for 12.1, and now it's 12.2's turn.
I don't remember having this issue in the previous releases, but in 12.2 I ran into a problem that is frustrating because I don't understand why it behaves like this; it seems to me that something is wrong with the design here.
Let me explain, the issue is quite simple. I'd like to create an instance using DBCA and a template, and set the memory parameters according to the physical memory of the server. However, I don't want Oracle to set them (take X percent of the memory and split it between SGA and PGA, as it can do); I want to set them myself (for example, if the server has 8GB RAM I'd like 2GB SGA and 2GB PGA, and if it has 12GB, I'd like 4GB SGA and 4GB PGA).
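Just to illustrate the rule I have in mind (the function name and thresholds below are my own example from the sizes mentioned above, not anything Oracle provides), the mapping from physical RAM to the values I want to pass to DBCA looks like this:

```python
def memory_params(ram_gb):
    """Map a server's physical RAM (in GB) to the SGA and PGA sizes (in GB)
    I want to pass to DBCA. The thresholds below are just the examples
    from the text, not a general sizing recommendation."""
    if ram_gb >= 12:
        return 4, 4  # 12GB server: 4GB SGA, 4GB PGA
    if ram_gb >= 8:
        return 2, 2  # 8GB server: 2GB SGA, 2GB PGA
    return 1, 1      # anything smaller: a conservative 1GB each

sga, pga = memory_params(8)
print(sga, pga)  # prints: 2 2
```

The point is that the sizing decision lives in my script, and DBCA just has to accept whatever values the script hands it.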
The script I built (originally for 11.2) uses a template with all the required settings, and I pass the memory parameters as command line arguments, like this:
dbca -silent -createDatabase -templateName "orcl.dbt" -gdbName orcl -variables -initParams SGA_TARGET=%SGA%,SGA_MAX_SIZE=%SGA%,PGA_AGGREGATE_TARGET=%PGA%
That happened in earlier versions as well (12.1 for sure; I don't remember about 11.2). When I created the DBT file (running DBCA, setting all the required settings and creating a template), there is a parameter under the "MiscParams" section called customSGA, and it is set to "false". In order for the memory parameters to work, it has to be "true". This is strange, as when I used DBCA to create the template I set the SGA and PGA myself, meaning I want custom memory settings (as opposed to Oracle taking X percent and setting the parameters itself). I don't know why this is not reflected in the DBT.
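For reference, this is roughly what that section of a DBT template looks like (reconstructed from memory, so the element names and order may differ slightly in a real 12.2 template); the flag in question is customSGA:

```xml
<MiscParams>
   <databaseType>MULTIPURPOSE</databaseType>
   <percentageMemTOSGA>40</percentageMemTOSGA>
   <!-- generated as "false" even though I chose the SGA/PGA sizes myself;
        it has to be "true" for the command line memory parameters to work -->
   <customSGA>false</customSGA>
</MiscParams>
```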
In 12.2, changing customSGA still didn't work. I managed to set SGA_TARGET, but for some reason SGA_MAX_SIZE and PGA_AGGREGATE_TARGET were set to some default. Apparently there are more command line arguments we can use (an almost full list is here). In this list I found "-memoryMgmtType", which accepts AUTO, AUTO_SGA or CUSTOM_SGA. I tried setting this parameter, but got the same behavior. I opened an SR for that and the engineer told me to use a different command line argument, "-automaticMemoryManagement false", to do that.
First, this argument is not in the documentation, only in MOS note 2309280.1. Second, I don't understand why there are 2 arguments for this: one sets the memory management type (auto or custom) and the other disables automatic memory management…
I set the parameter, but it still didn't work. As described in MOS note 2212857.1, you cannot set SGA_TARGET and SGA_MAX_SIZE in the dbca command line. From my experience, I managed to set SGA_TARGET, but not SGA_MAX_SIZE and not PGA_AGGREGATE_TARGET. The note says it has been like this since 12.1, but I don't remember having this issue then, only in 12.2. Maybe I need to go and recheck that. I can't see why this is a limitation. When I changed the parameters in the DBT file and used "-automaticMemoryManagement", everything worked as expected.
In my opinion, when I set something in the command line, it should overwrite whatever the DBT file has and should work.
I’ll probably have to rewrite my code now so it will actually change the settings in the DBT file before executing DBCA. I hate doing this stuff in Windows…
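A minimal sketch of what that rewrite could look like, assuming the template stores the flag as a customSGA element and the memory parameters as initParam entries with name/value attributes (those names are my reading of the template format, not verified ones), using plain string and regex replacement so it can run before DBCA is called:

```python
import re

def patch_template(text, sga_bytes, pga_bytes):
    """Flip customSGA to true and force the memory parameters in the
    template text before handing it to DBCA. The element and attribute
    names here are assumptions about the DBT format, not verified ones."""
    text = text.replace("<customSGA>false</customSGA>",
                        "<customSGA>true</customSGA>")
    for param, value in (("sga_target", sga_bytes),
                         ("sga_max_size", sga_bytes),
                         ("pga_aggregate_target", pga_bytes)):
        # rewrite the value attribute of the matching initParam entry
        text = re.sub(r'(<initParam name="%s" value=")[^"]*(")' % param,
                      r'\g<1>%d\g<2>' % value,
                      text)
    return text

# hypothetical usage: patch orcl.dbt in place, then run dbca against it
# with open("orcl.dbt") as f:
#     patched = patch_template(f.read(), 2 * 1024**3, 2 * 1024**3)
```

Not pretty, but it keeps the sizing logic in one place and sidesteps the command line limitation entirely.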