COMMAND
bash
SYSTEMS AFFECTED
Those running bash as their shell
PROBLEM
Wojciech Purczynski found the following. BASH allocates memory
incorrectly for lines read from redirected standard input. If you
use the CMD << _EOF_WORD_ operator to redirect standard input, BASH
reads the following lines from the command input (either a tty or a
shell script) into dynamically allocated memory until it encounters
_EOF_WORD_. However, BASH allocates only 1000 bytes for the first
line, regardless of its length. Looking at the source code, you
will notice the following in 'make_cmd.c':
  if (len + document_index >= document_size)
    {
      document_size = document_size ? 2 * (document_size + len)
                                    : 1000; /* XXX */
      document = xrealloc (document, document_size);
    }
So, if we type a line longer than 1000 characters, BASH will exit
with 'Segmentation fault (core dumped)'. Here is an example script:
#!/bin/bash
cat << _EOF_
_here_should_be_line_longer_than_1000_bytes________
_EOF_
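The placeholder line above can be generated programmatically. A
minimal sketch (the 1100-byte length is an arbitrary choice above
the 1000-byte threshold): on an unpatched bash up to 1.14.7 this
segfaults; a fixed bash simply prints the line back.

```shell
#!/bin/bash
# Build a line of 1100 'A' characters and feed it through a
# here-document; the first heredoc line then exceeds the 1000-byte
# initial allocation in make_cmd.c.
line=$(printf 'A%.0s' {1..1100})
cat << _EOF_
$line
_EOF_
```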
It is believed that all versions up to 1.14.7 have this bug.
SOLUTION
Just replace '1000' with '1000 + len' and everything should be OK.
Patch:
--- start of bash-1.14.7-redir.patch ---
--- make_cmd.c	Fri Jul  1 01:15:03 1994
+++ make_cmd.c.redir	Mon Apr  5 22:33:43 1999
@@ -424,7 +424,7 @@
   if (len + document_index >= document_size)
     {
       document_size = document_size ? 2 * (document_size + len)
-                                    : 1000; /* XXX */
+                                    : 1000+len; /* much better,huh? */
       document = xrealloc (document, document_size);
     }
--- end of bash-1.14.7-redir.patch ---
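To see why the one-term change matters, the first pass of the sizing
check can be simulated in shell arithmetic (variable names mirror
make_cmd.c; the 1500-byte line length is an arbitrary example):

```shell
#!/bin/bash
# Simulate the first allocation in make_cmd.c for a 1500-byte first
# heredoc line; document_size starts at 0, so the ternary takes the
# ": 1000" branch.
len=1500
document_size=0
document_index=0
if [ $((len + document_index)) -ge "$document_size" ]; then
  if [ "$document_size" -ne 0 ]; then
    document_size=$((2 * (document_size + len)))
  else
    buggy=1000               # original: ignores len, buffer too small
    patched=$((1000 + len))  # patched: 2500, room for the whole line
  fi
fi
echo "buggy first allocation:   $buggy"
echo "patched first allocation: $patched"
```

With the original code the 1500-byte line is copied into a 1000-byte
buffer; with the patch the buffer is always at least len bytes larger
than the line.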
Besides, this was fixed a long time ago, in bash-2.02.1.