Fix the processing of character escape sequences
r2456 unified escape sequence processing, but it results in the \\ sequence being processed multiple times (lexer, regex conversion, backend pcre parsing).

What used to happen was that the lexer would only convert octal sequences and a few special escapes; \\ would be passed through the lexer and the regex conversion, and thus was only handled in the pcre backend. r2456 changed that so that \\ is handled by the lexer, converting it to \, which is then handled as an escape sequence in both the regex conversion and the pcre backend. This means that \\001, instead of being treated as the literal \001, is treated as an octal escape sequence, which is rejected by the regex conversion (it only allows certain special chars), etc.

Fix this by ensuring the lexer does not process \\ and instead passes it through, so that it is only handled in the backend, as was done in the past.

Also fix front-end escape sequence processing of octals etc. so that it does not produce a new escape sequence later in the pipeline. That is, \134, \d92, ... would get converted to \ in the lexer and then be treated as an escape sequence in the regex conversion or pcre processing. We fix this by converting them to the equivalent \\ sequence in the lexer and letting the backend process it.

Signed-off-by: John Johansen <john.johansen@canonical.com>
Acked-by: Steve Beattie <steve@nxnw.org>
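For illustration, here is a minimal standalone C sketch (not code from the tree; normalize_escapes() is a hypothetical helper) of the front-end behaviour described above: a literal \\ is passed through untouched, and an octal escape that decodes to a backslash, such as \134, is re-emitted as \\, so the single remaining level of escaping is resolved by the backend pcre conversion.

/*
 * Illustrative sketch only: front-end escape normalization as described
 * in the commit message. Not the AppArmor parser's actual code path.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static char *normalize_escapes(const char *in)
{
        /* worst case every input byte becomes two output bytes */
        char *out = malloc(strlen(in) * 2 + 1);
        char *s = out;

        if (!out)
                return NULL;

        while (*in) {
                if (in[0] == '\\' && in[1] == '\\') {
                        /* literal \\ : pass it through for the backend */
                        *s++ = '\\';
                        *s++ = '\\';
                        in += 2;
                } else if (in[0] == '\\' && in[1] >= '0' && in[1] <= '7') {
                        /* octal escape: decode up to three octal digits */
                        long c = 0;
                        int i;
                        in++;
                        for (i = 0; i < 3 && *in >= '0' && *in <= '7'; i++)
                                c = c * 8 + (*in++ - '0');
                        if (c == '\\') {
                                /* \134 decodes to '\': re-escape it */
                                *s++ = '\\';
                                *s++ = '\\';
                        } else {
                                *s++ = (char)c;
                        }
                } else {
                        *s++ = *in++;
                }
        }
        *s = '\0';
        return out;
}

int main(void)
{
        char *a = normalize_escapes("\\\\001"); /* profile text \\001 */
        char *b = normalize_escapes("\\134");   /* profile text \134  */

        printf("%s\n", a); /* \\001 : literal \001 left for the backend */
        printf("%s\n", b); /* \\    : backend sees an escaped backslash */
        free(a);
        free(b);
        return 0;
}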
commit 80cb9dd67b
parent b917e30c35
1 changed file with 8 additions and 1 deletion
@@ -459,7 +459,14 @@ char *processunquoted(const char *string, int len)
                 long c;
                 if (*string == '\\' && len > 1 &&
                     (c = strn_escseq(&pos, "", len)) != -1) {
-                        *s++ = c;
+                        /* catch \\ or \134 and pass it through to be handled
+                         * by the backend pcre conversion
+                         */
+                        if (c == '\\') {
+                                *s++ = '\\';
+                                *s++ = '\\';
+                        } else
+                                *s++ = c;
                         len -= pos - string;
                         string = pos;
                 } else {
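A hedged usage sketch (not a test from the tree) of how the patched processunquoted() is expected to behave when linked against the parser, based only on the hunk and commit message above: both \\ and \134 reach the backend as \\, while other octal escapes are still decoded by the front end.

#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* prototype as shown in the hunk header above */
char *processunquoted(const char *string, int len);

int main(void)
{
        /* a literal \\ stays \\ so only the pcre backend unescapes it */
        char *a = processunquoted("\\\\", 2);
        assert(a && strcmp(a, "\\\\") == 0);

        /* \134 (octal for '\') is re-emitted as \\ instead of a bare backslash */
        char *b = processunquoted("\\134", 4);
        assert(b && strcmp(b, "\\\\") == 0);

        /* other octal escapes are still decoded by the front end
         * (per the commit message; this case is an assumption) */
        char *c = processunquoted("\\101", 4);
        assert(c && strcmp(c, "A") == 0);

        free(a);
        free(b);
        free(c);
        return 0;
}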