URL manipulation with regular expressions is very powerful. Using a pattern, you can define whether or not the action is executed for the actual URL.
Use the following syntax for lines in the action file:
Syntax
<operation> <pattern> <dest> [<option>]
The values mean the following:
Name | Meaning and Possible Values
---|---
<operation> | Redirecting requests: RegRedirectUrl, RegIRedirectUrl. Filtering requests: RegForbiddenUrl, RegIForbiddenUrl, RegGoneUrl, RegIGoneUrl. Rewriting URLs: RegRewriteUrl, RegIRewriteUrl
<pattern> | Regular expression (pattern) for the URL
<dest> | Substitution expression. With this you can access the request values or the system values, or you can use $n to access the partial string found in the search string (in brackets). If no substitution is to be made, specify "-".
<option> (optional) | Options for the rules (these depend on the <operation>): noescape, restart, break, compound, skip. For more information, see Rewriting URLs.
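A minimal action file fragment illustrating the three rule classes might look like this (the paths /old, /new, and /internal are placeholders, not part of the original documentation):

```
# Redirect: send the client an HTTP redirect to the new location
RegIRedirectUrl    ^/old/(.*)       /new/$1

# Filter: reject internal paths (no substitution, so <dest> is "-")
RegIForbiddenUrl   ^/internal/(.*)  -

# Rewrite: change the URL internally; [break] stops further rule processing
RegIRewriteUrl     ^/dtcsld(.*)     /sld$1  [break]
```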
Note
The regular expression is applied to the URL-decoded path; the URL is not normalized. URL decoding converts escaped characters (%xx) into the corresponding ASCII values.
Example
The path
sap(bD1kZSZjPTAwMA==)/bc/bsp%3csap%20test/it00/default.htm
is decoded to
sap(bD1kZSZjPTAwMA==)/bc/bsp<sap test/it00/default.htm
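This decoding step can be reproduced with Python's urllib.parse.unquote (a sketch for illustration only; the server performs this step internally before pattern matching):

```python
from urllib.parse import unquote

# %3c decodes to "<" and %20 decodes to a space
path = "sap(bD1kZSZjPTAwMA==)/bc/bsp%3csap%20test/it00/default.htm"
decoded = unquote(path)
print(decoded)  # sap(bD1kZSZjPTAwMA==)/bc/bsp<sap test/it00/default.htm
```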
Recommendation
Since the logical "not" cannot easily be formulated with regular expressions, the search result can be negated by prefixing the pattern with an exclamation mark (!):
The line
RegIForbiddenUrl !^/sap/(.*) -
in the action file rejects all requests that do not begin with the prefix "/sap/".
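The effect of the leading exclamation mark can be sketched in Python (matches_rule is a hypothetical helper written for this illustration, not the actual ICM implementation):

```python
import re

def matches_rule(pattern: str, path: str) -> bool:
    """Return True if the rule applies to the path.

    A leading "!" negates the result of the regular-expression search,
    mimicking the action-file negation described above.
    """
    negate = pattern.startswith("!")
    if negate:
        pattern = pattern[1:]
    found = re.search(pattern, path) is not None
    return found != negate

# RegIForbiddenUrl !^/sap/(.*) -  rejects everything outside /sap/
print(matches_rule("!^/sap/(.*)", "/other/page"))  # True  -> request is rejected
print(matches_rule("!^/sap/(.*)", "/sap/bc/x"))    # False -> request is allowed
```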
Caution
Note that the lines
RegIRewriteUrl ^/dtcsld(.*) /sld$1
RegForbiddenUrl /(.*) -
always return "forbidden", since all rules are applied in succession and every URL, including the rewritten one, matches the pattern /(.*) of the second rule.
If you do not want the second rule to apply once the first rule has matched, the first rule must be:
RegIRewriteUrl ^/dtcsld(.*) /sld$1 [break]
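The sequential evaluation, and how [break] stops it, can be sketched as follows (apply_rules is a hypothetical helper written for this illustration, not the actual ICM implementation):

```python
import re

def apply_rules(rules, path):
    """Apply action-file rules in order.

    rules: list of (operation, pattern, dest, options) tuples.
    Returns ("forbidden", path) or ("ok", rewritten_path).
    """
    for op, pattern, dest, options in rules:
        # "RegI..." operations match case-insensitively
        flags = re.IGNORECASE if op.startswith("RegI") else 0
        if not re.search(pattern, path, flags):
            continue
        if "Forbidden" in op:
            return ("forbidden", path)
        if "Rewrite" in op:
            # $1, $2, ... in <dest> become \1, \2, ... for re.sub
            path = re.sub(pattern, re.sub(r"\$(\d)", r"\\\1", dest), path, flags=flags)
            if "break" in options:
                return ("ok", path)  # stop processing further rules
    return ("ok", path)

rules_no_break = [
    ("RegIRewriteUrl", "^/dtcsld(.*)", "/sld$1", []),
    ("RegForbiddenUrl", "/(.*)", "-", []),
]
print(apply_rules(rules_no_break, "/dtcsld/main"))    # ('forbidden', '/sld/main')

rules_with_break = [
    ("RegIRewriteUrl", "^/dtcsld(.*)", "/sld$1", ["break"]),
    ("RegForbiddenUrl", "/(.*)", "-", []),
]
print(apply_rules(rules_with_break, "/dtcsld/main"))  # ('ok', '/sld/main')
```

Without [break], the rewritten URL /sld/main still reaches the forbidden rule; with [break], processing stops after the rewrite.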