How about this:
Run md5sum on all files as part of a script, with the date and time as part of the output filename. Then have the md5sum file sent to you and run diff against the previous one. You can schedule this via crontab and have it run every hour.
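For the hourly schedule, a crontab entry along these lines would do it. The script name and paths are placeholders I made up for illustration, not anything that exists yet:

    # min hour day-of-month month day-of-week  command
    # Run the (hypothetical) checksum script at the top of every hour,
    # appending its output to a log so failures are visible.
    0 * * * * /home/youruser/bin/md5_snapshot.sh >> /home/youruser/md5_snapshot.log 2>&1

You can add that with crontab -e if the host allows shell access; cPanel also has a Cron Jobs page that sets up the same thing without root.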
Now, what I might include in the script is an ls that recursively lists all files and directories. Send that output to a file, then feed the listed paths to md5sum so it checksums each file, and write the sums to an output file whose name includes the date and time. I may have to check how easy this is to do, but my feeling is that it's about as simple as it sounds.
So,
1. ls all files and folders recursively and send the output to a file called complete_ls.txt
2. Feed the paths in complete_ls.txt to md5sum so it reads each path and filename and calculates a checksum, and write the results to an output file named md5sum_yyyy-mm-dd_HH-MM-SS.txt
3. Run diff on the previously created md5sum file against the newest one. Changes should show up immediately. (A rough sketch of this follows below.)
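Here is that sketch as a shell script. Everything in it is an assumption for illustration: the base directory, the snapshot directory, the notification address, and the script name (md5_snapshot.sh, matching the crontab line above). I used find instead of ls for step 1 because its output is one plain path per line, which is easier to feed to md5sum:

    #!/bin/sh
    # md5_snapshot.sh -- hypothetical sketch of steps 1-3 above; untested.
    BASE="$HOME/public_html"       # tree being watched (assumption)
    OUT="$HOME/md5_snapshots"      # where listings and sums are kept (assumption)
    STAMP=$(date +%Y-%m-%d_%H-%M-%S)

    mkdir -p "$OUT"

    # 1. Recursively list all files into complete_ls.txt.
    find "$BASE" -type f > "$OUT/complete_ls.txt"

    # 2. Checksum every listed file into a timestamped output file.
    #    -d '\n' (GNU xargs) keeps filenames containing spaces intact.
    xargs -d '\n' md5sum < "$OUT/complete_ls.txt" > "$OUT/md5sum_$STAMP.txt"

    # 3. Diff the newest sum file against the previous one and mail any changes.
    #    'mail' assumes a mail/mailx client is available on the host.
    NEW="$OUT/md5sum_$STAMP.txt"
    PREV=$(ls -1 "$OUT"/md5sum_*.txt | tail -n 2 | head -n 1)
    if [ "$PREV" != "$NEW" ] && ! diff -u "$PREV" "$NEW" > "$OUT/changes.txt"; then
        mail -s "File changes detected" you@example.com < "$OUT/changes.txt"
    fi

On the first run there is no previous sum file, so PREV and NEW point at the same file and the diff is skipped. Checksums catch content changes; if you also want to catch permission or ownership changes, you could diff the recursive ls -l output as well.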
What do you guys think?
-Eric
From the Central Offices of the Technomage Guild, Security detections dept.
> On May 11, 2021, at 3:08 PM, David Schwartz via PLUG-discuss <plug-discuss@lists.phxlinux.org> wrote:
>
> I have had a shared hosting WHM (“reseller”) type account for years, and I’m constantly getting my Wordpress sites hacked.
>
> I just discovered another new WP site got hacked. I’m so sick of this. I notified my hosting provider and of course, they said they ran a scan and found nothing.
>
> It takes me just a couple of minutes to poke around using the cPanel File Manager to find litter the hacker has left. This time they added a new mailbox.
>
> I’m sick and tired of the hosting providers being so damn narrow-minded that they think scanning files looking for matching file signatures is effective. They have found exactly NONE of the files I’ve discovered over the past few years that hackers have left. NOT A SINGLE ONE!
>
> Also, as inept as they are, they do provide a lot of admin stuff I don’t want to deal with, and I do not have any interest in self-hosting on a dedicated machine (physical or virtual). It’s just a headache I don’t want to deal with.
>
> What I’d like to do is install a script or program that can scan through my file tree from …/public_html/ down and look for changes in the file system since the last scan, which is what tripwire does.
>
> Installing tripwire usually requires root access, but that’s impossible on a shared server.
>
> All it would do is something like an ‘ls -ltra ~/public_html’ with a CRC or hash of the file added to the lines. (Is there a flag in ls that does that?) The output would be saved to a file.
>
> Then you'd run a diff on that and the previous one, and send the output to an email, then delete the previous one. Or keep them all and only compare the two latest ones. Whatever.
>
>
> As an aside, I know that Windows has a way of setting up a callback where you can get an event trigger somewhere whenever something in a designated part of the file system has changed.
>
> Is this possible in Linux?
>
> -David Schwartz
>
>
>
---------------------------------------------------
PLUG-discuss mailing list - PLUG-discuss@lists.phxlinux.org
To subscribe, unsubscribe, or to change your mail settings:
https://lists.phxlinux.org/mailman/listinfo/plug-discuss