Analyzing the Attacks on my Website

Jeremy Morgan - Feb 8 '20 - Dev Community

I was casually doing a security audit on my blog (JeremyMorgan.com) recently and decided to look a little deeper into my security logs. With a bit of Linux command line kung fu, some Golang, and Google Sheets, I was able to get a pretty good idea of where the attacks are coming from.

To start, I'm using CentOS to host my site, so I checked out /var/log/secure. This log is where authentication logs are stored on my server.

This is what the log file looks like:

[screenshot of the /var/log/secure log file]

And with 301,327 lines, it's not likely I'm going to manually look through much of it. Let's automate this a bit.
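
If you want to check the size of your own log first, a quick line count does the trick:


# count the lines in the auth log
wc -l /var/log/secure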

Getting the IP Addresses of Attackers

I wanted to extract the IP address of attackers from this file. That way I can block them.

I started to mess around with Linux commands until I came up with this script.

What it does is pretty simple. It looks for these strings:



declare -a badstrings=("Failed password for invalid user"
                "input_userauth_request: invalid user"
                "pam_unix(sshd:auth): check pass; user unknown"
                "input_userauth_request: invalid user"
                "does not map back to the address"
                "pam_unix(sshd:auth): authentication failure"
                "input_userauth_request: invalid user"
                "reverse mapping checking getaddrinfo for"
                "input_userauth_request: invalid user"
                )



These strings identify failed attacks in the log. If someone put in a wrong username or tried some other form of attack, the log line will contain one of these strings.

So we loop through that list, search for each string, and extract an IP address from every line where it appears.



cat /var/log/secure | grep "$i" | grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}" | awk '{print $0}' | sort | uniq >> "temp.txt"



It then dumps those IPs into a temp.txt file, and it does this for every message in my "badstrings" list.
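
For context, that grep pipeline runs once for each entry in the badstrings array. The loop itself isn't shown above, but a minimal sketch of it looks something like this:


for i in "${badstrings[@]}"
do
  # pull every IPv4 address out of the log lines matching the current bad string
  cat /var/log/secure | grep "$i" | grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}" | sort | uniq >> "temp.txt"
done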

That text file had a ton of duplicates in it, so I removed the duplicates and put only the unique IPs into a file:



# grab unique ips from temp and put them in a file
cat "temp.txt" | sort | uniq > "badguyips.txt"
# remove the temp file
rm "temp.txt"



Cool, now I have a list of IP addresses ready to go.


Yikes, I have 1,141 IP addresses here.

Blocking Them

Now I want to block these IP addresses. Since I'm running iptables, I can just drop them with this simple script:



#!/bin/bash
input="badguyips.txt"
while IFS= read -r line
do
  iptables -A INPUT -s "$line" -j DROP
done < "$input"

service iptables save



Cool. Now the attackers are blocked from my server.

Then I got curious. Where the heck are these attacks coming from?

Getting their Location Data

Since I have a list of IP addresses, I thought I'd run them against a GeoIP database like MaxMind to find some location information. So I did just that.

I wrote a Golang program called "find the bad guys" that goes through the text file of IP addresses, looks up the location information for each one, and writes it out to a series of text files.

I wrote out locations based on:

  • Continent
  • Countries
  • Cities
  • Subdivisions (states/provinces)
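
I'm not going to paste the whole Go program here, but the lookup step boils down to feeding each IP from badguyips.txt to a MaxMind city database and writing out the names that come back. If you'd rather not write any Go, here's a rough sketch of the same idea using the mmdblookup CLI that ships with libmaxminddb; the database file name, output file names, and lookup paths are assumptions, not my exact setup:


#!/bin/bash
# rough stand-in for the Go lookup program, using mmdblookup
# assumes a GeoLite2-City.mmdb database downloaded from MaxMind into this directory
db="GeoLite2-City.mmdb"

lookup () {
  # $1 is the IP address; the remaining arguments are the record path (e.g. country names en)
  mmdblookup --file "$db" --ip "$1" "${@:2}" 2>/dev/null \
    | grep utf8_string | sed -e 's/ <utf8_string>//' -e 's/^ *//' -e 's/"//g'
}

while IFS= read -r ip
do
  lookup "$ip" continent names en      >> continents.txt
  lookup "$ip" country names en        >> countries.txt
  lookup "$ip" city names en           >> cities.txt
  lookup "$ip" subdivisions 0 names en >> subdivisions.txt
done < badguyips.txt


Either way, the end result is one line of location text per IP in continents.txt, countries.txt, cities.txt, and subdivisions.txt.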

I wanted to see where the attacks are coming from and share that information, so I ran the program I built. Now I have some helpful lists of location information:

[screenshot of the generated location files]

Continents

So now I want to take a look at continents.txt.

[screenshot of continents.txt]

Well, that's going to be a problem, there are some duplicates.

I can run a quick command and get unique values:



cat continents.txt | sort | uniq



The results should come as no surprise if you've ever looked at a globe:

[screenshot of the unique continent values]

But I want to see how many attacks come from each continent. So I call on my old friend uniq for that:



awk -F '\n' '{print $0}' continents.txt | sort | uniq -c



[screenshot of the per-continent counts]

Pretty sweet, right? So I'll remove the leading spaces, insert a comma after the count and drop it into a text file.



awk -F '\n' '{print $0}' continents.txt | sort | uniq -c | awk '{$1=$1};1' | sed -r 's/\s+/,/'  > continent-totals.txt




Now I can drop it into Google Sheets and get this nice chart:

[chart of attacks per continent]

I repeat this same process for the other locations (country, city, subdivision), so I won't walk through each one. Here are my results:

Countries


Here are the top 10 countries attackers are coming from:

  • China (304)
  • United States (138)
  • France (95)
  • India (46)
  • Singapore (43)
  • South Korea (38)
  • Germany (37)
  • Russia (37)
  • Brazil (35)
  • United Kingdom (29)

Cities


Attacks per city are a little more aggregated.

  • Beijing (57)
  • Shanghai (53)
  • Hefei (25)
  • Amsterdam (21)
  • Bengaluru (16)
  • London (14)
  • Xinpu (14)
  • Clifton (10)
  • North Bergen (9)

But still pretty interesting.

Subdivisions


This one is aggregated even more. But it drills down a bit more. Here are the top 10 subdivisions attackers are coming from:

  • Beijing (145)
  • Shanghai (61)
  • Anhui (26)
  • England (22)
  • Jiangsu (22)
  • New Jersey (22)
  • North Holland (22)
  • California (18)
  • Sao Paulo (18)
  • Karnataka (16)

Conclusion

Great things always come from curiosity. I'm curious about what other kinds of patterns and data I can extract from this, so I'm going to keep experimenting and playing with it.

If you decide you want to do this for your website, try these steps, and let me know if you need any help with it.
