I regularly look through old posts here to see what might need updating. Today I came across this one, which referenced a thread from 2003 in the comp.unix.sco.mis newsgroup.
The original text is just below and today I've added more commentary after that.
In article <d68e2226.0306090402.4849b762@posting.google.com>, JWR <no_spam_please@lycos.co.uk> wrote:

>Can anyone suggest what the maximum number of characters you can
>assign to a variable (including previously assigned variables), and
>also if there is a limit to the number of variables that can be
>assigned in one script/programme?
>
>For example:
>
>a=(first character - last character) <-- how many characters could
>this be?

A lot. I just did this:

# ulimit -v 2000000
# j=$(dd bs=1024k count=512 if=/dev/byte/ascii/a)
512+0 records in
512+0 records out
# echo ${#j}
536870912

That's a half gigabyte. I tried the same thing with a gigabyte and got a memory fault. I tried the same thing in ksh93 and it told me it was out of memory; ksh88 probably encountered the same problem but didn't catch it and so faulted.

I'd guess that ksh needs two or more times the space used by variable text at some point during processing of the assignment (though possibly only for command line substitution as above).

>b=(first variable - last variable) <-- how many variables could this be?

I don't think there's a limit on this.

John
--
John DuBois spcecdt@armory.com KC6QKZ/AE https://www.armory.com/~spcecdt/
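(A side note if you want to try John's test today: /dev/byte/ascii/a is a SCO device, so on Linux or Mac OS X you need a substitute. Something like this should be roughly equivalent - /dev/zero piped through tr to build half a gigabyte of "a" characters. Don't run it on a machine anyone else is using:

j=$(dd bs=1024k count=512 if=/dev/zero 2>/dev/null | tr '\0' 'a')
echo ${#j}

The ${#j} expansion reports the length of the variable in characters, just as in John's example.)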
March 6th, 2011:
I did this on a 2 GB Mac OS X machine, using bash with /dev/urandom and od -x:
#!/bin/bash
x=1024
while :
do
  j=$(od -x /dev/urandom | head -$x)
  echo -n $x
  echo $j | wc -w
  x=$((x * 2))
done

(output)

1024 9216
2048 18432
4096 36864
8192 73728
16384 147456
32768 294912
65536 589824
131072 1179648
262144 2359296
524288 4718592
1048576 9437184
2097152 18874368
... (hangs, system unresponsive)
(don't try that on a server or any multiuser system - you will bring it to its knees)
On a 4GB machine, it went no farther, which shows that the limit is NOT simply the available RAM. Both machines showed around 1 GB free in Activity Monitor when I started these scripts.
In 2013, on a 12GB iMac, the loop got quite a bit further:
1024 9216
2048 18432
4096 36864
8192 73728
16384 147456
32768 294912
65536 589824
131072 1179648
262144 2359296
524288 4718592
1048576 9437184
2097152 18874368
4194304 37748736
8388608 75497472
16777216^C (I interrupted out of boredom)
So what limits that? It isn't ARG_MAX:
$ getconf ARG_MAX
262144
According to "Fedora 11 bash maximum line length?", Bash itself has no preset line length limit.
This article on kernel command line limits says it is ARG_MAX, but that can't be right.
So, is getconf wrong? Nope: /usr/include/sys/syslimits.h says:
#define ARG_MAX (256 * 1024) /* max bytes for an exec function */
This thread has comments on exec limits, but of course we aren't execing anything here. This all takes place inside the shell - no exec.
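If you want to see where ARG_MAX actually does bite, compare a builtin with an external command. This is only a sketch (exact limits vary by system), but on a machine where ARG_MAX is 256 KB the external echo should fail while the builtin sails right through:

big=$(dd bs=1k count=512 if=/dev/zero 2>/dev/null | tr '\0' 'a')   # 512 KB of 'a' characters
echo ${#big}                  # 524288 - twice ARG_MAX on this Mac
echo "$big" > /dev/null       # builtin echo: no exec, works fine
/bin/echo "$big" > /dev/null  # external command: "Argument list too long"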
So, how far can this go, really? I tried this:
j=$(od -x /dev/urandom )
while watching in Activity Monitor. I killed the "od" process and tried to count $j - the system ran out of RAM and froze up.
So is there a limit? Apparently, though it isn't what people think. It isn't ARG_MAX, and there's more RAM being used than just the bytes in the variable - a lot more.
I modified the script to add a "read ak" so that I could watch Free Memory on each loop.
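That is, each iteration paused on a read before doubling, roughly like this:

#!/bin/bash
x=1024
while :
do
  j=$(od -x /dev/urandom | head -$x)
  echo -n $x
  echo $j | wc -w
  read ak          # pause so Free Memory can be checked before the next doubling
  x=$((x * 2))
done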
The "1048576" read pushed it down to 500 MB. The next loop (which was the last successful loop) pulled it right down into single digits and it needed plenty of time to find enough RAM to finish.
Is this because of sloppy garbage collection? It could be, but if you try it with one shot of 2097152 lines, it sucks down just as much memory... much, much more than the variable itself is holding.
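To try that one-shot version yourself (again, not on a machine anyone else is using), it's just a single assignment:

j=$(od -x /dev/urandom | head -2097152)   # 2,097,152 lines in one assignment
echo $j | wc -w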
An 18 MB or larger variable (70+ MB on the iMac!) is much more than any shell script is likely to need, but it does show that you need not fear assigning fairly large chunks of data if you need to.
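For a more realistic example, pulling a whole file into a variable and checking its size is perfectly reasonable - here assuming your system has /usr/share/dict/words, which runs to a couple of megabytes on most systems:

content=$(cat /usr/share/dict/words)   # roughly 2 MB on many systems
echo ${#content}                       # length of the variable in characters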
Got something to add? Send me email.
Mon Mar 7 16:43:36 2011: 9365 BigDumbDinosaur
$ getconf ARG_MAX
262144
On my Linux rig running AMD Opteron hardware, ARG_MAX = 2097152. As this is a live system, I'm not going to push it to see what happens if I load up the environment. :-)