Once SSHDroid (https://play.google.com/store/apps/details?id=berserker.android.apps.sshdroid&hl=en) is installed on a rooted device, you can start zipping around in places like /data/user/0/<packagename>/shared_prefs.
To log in, you might first need to turn off “Enable root” in SSHDroid. After connecting with ssh root@<ip> -p 2222, simply su.
This is kinda cool. One way of enumerating usernames is to try a username against a login screen and have the error message tell you “That username doesn’t exist,” or to try creating a new account and have the system tell you “That username already exists.” But if a site is coded properly, it won’t give you that kind of information, making username enumeration (i.e., figuring out valid, existing usernames) harder. So how about figuring them out with a timing attack?
When a username and password are submitted to a site for checking, the DBMS has to find the username, and once it finds the matching row, it checks the submitted password’s hash against the stored one. If the username doesn’t exist, though, the DBMS doesn’t need to bother computing or comparing a password hash; it can just return the generic failure message. That small difference shows up in the response time.

In a recent test, I created a list of 50 usernames, 5 of which were known good, with the valid usernames interspersed among the invalid ones. I used the same password for every attempt and ran them through Burp Intruder. The five valid usernames returned the slowest response times: one invalid username snuck in among the six slowest responses, but my five valid usernames were all right there. Knowing this, I could do some open-source searches for potential usernames and test them against a login screen. I also tested usernames of varying lengths, and it didn’t change the results. If I suspected a list was mostly valid usernames, I could pad it with likely garbage ones, things like “aaaaaaaaa” or “nekdhspfacshabdfks”, for comparison. This one will be fun to try again in future assessments.
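The server-side asymmetry behind this is easy to simulate. Here’s a minimal, self-contained Python sketch (the user table, password, and KDF parameters are all invented for illustration): only existing usernames trigger the expensive hash computation, so sorting candidate usernames by response time floats the valid ones to the top.

```python
import hashlib
import os
import time

# Hypothetical user table: only these usernames "exist".
def _make_record(password: str):
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

_users = {name: _make_record("hunter2") for name in ("alice", "bob", "carol")}

def login(username: str, password: str) -> bool:
    record = _users.get(username)
    if record is None:
        # Unknown username: generic failure, no hash computed (the fast path).
        return False
    salt, stored = record
    # Known username: the expensive KDF runs -- this is the timing leak.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000) == stored

def timed_login(username: str) -> float:
    start = time.perf_counter()
    login(username, "wrongpassword")
    return time.perf_counter() - start

# Mix valid usernames in with garbage, then sort slowest-first.
candidates = ["mallory", "alice", "aaaaaaaaa", "bob", "nekdhspfacshabdfks", "carol"]
slowest_first = sorted(candidates, key=timed_login, reverse=True)
print(slowest_first)  # the three valid usernames should sort to the front
```

Against a real site, the measurement side of this is exactly what Burp Intruder’s response-time column gives you for free; the gap is smaller and noisier than in this toy, so repeated requests per candidate help.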
My first test was with my manager. It was a web test, and the site was pretty solid. One fun thing, though, was something I saw in a presentation at BSides Baltimore last week: a bad password policy may be a low finding; a lack of lockout after repeated bad auth attempts may be a low finding; username enumeration may be a low finding. But if a site has all three? That’s a critical finding. If you can enumerate a list of valid usernames (just check LinkedIn for names and figure out the username format) and then throw the top 1,000 passwords against that list, you’ll get in.
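As a sketch of why the combination stacks up to critical, here’s a hypothetical password-spraying order in Python (the usernames and passwords are made up for illustration). Trying one password at a time across every account keeps per-account attempt counts low, which only matters if a lockout exists; with no lockout, you can simply keep working down the list.

```python
from itertools import product

# Hypothetical inputs: usernames derived from a LinkedIn-style naming
# format, plus a short slice of a "top passwords" list.
usernames = ["jsmith", "mjones", "klee"]
top_passwords = ["Password1", "Welcome1", "Spring2024!"]

def spray_order(users, passwords):
    """Yield (username, password) attempts one password at a time across
    all accounts, rather than hammering one account with every password."""
    for pwd, user in product(passwords, users):
        yield user, pwd

attempts = list(spray_order(usernames, top_passwords))
for user, pwd in attempts[:3]:
    print(user, pwd)  # the first pass tries "Password1" against every account
```

With a real top-1000 list and a few hundred enumerated usernames, that’s a few hundred thousand attempts, and with no lockout and a generic error message, nothing stops you from making all of them.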
There was some other stuff too, but I wrote the report and sent it in. Looking forward to the next one!
I got a job as a penetration tester, which is really exciting. It’s a job I get excited about, one that brings both frustration and a feeling of accomplishment. I officially start on April 11th. My plan is to track my progress here and document the things I learn along the way.
I contacted some friends who are pentesters and asked for their advice: things they wish they had known when they got started. I was given two great recommendations of things to read and study up on. One was the publications on GitHub from Cure53. Today I read their whitepaper on X-Frame-Options and the various ways to bypass the clickjacking protection it provides. I’m looking forward to reading the others once I finish the other recommendation: The Tangled Web!
Sometimes you hear of third-party content providers getting compromised. These are the widgets that sites use for content links; they may take the form of little ads, or of a “You’ll never believe what this Hollywood star did!” teaser. Sites trust those providers to load content into their pages. But what happens when one of those providers gets compromised by hackers? The hackers can then push their message, or their malware, onto dozens or possibly hundreds of sites all at once. Want to know more? I wrote a section on “Emerging Threats” for the Akamai State of the Internet Report. I’d suggest reading the whole thing, but my part starts on page 29.