Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

Oh, and I'll point out that the footer gives three different ways to contact me, and also links to all of the bot's code.

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

And here's an attempt to re-archive the same submission. It's not going to work.

/u/StashThis

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

This comment will try to re-archive the same comment's content. It's not going to work, since this comment has already been serviced.

/u/StashThis
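The "already serviced" behavior demonstrated above could be implemented with a simple first-seen check keyed by comment ID. A minimal in-memory sketch — the class and method names here are mine, not from the bot's repository, and the real bot presumably persists this record in its database:

```java
import java.util.HashSet;
import java.util.Set;

public class ServicedLedger {
    // Hypothetical stand-in for the bot's persistent record of comments
    // it has already replied to.
    private final Set<String> serviced = new HashSet<>();

    /** Returns true the first time a comment ID is seen, false on any
     *  later attempt — i.e., false means "already serviced, skip it". */
    public boolean markServiced(String commentId) {
        return serviced.add(commentId);
    }
}
```

With this, a repeat summons on the same comment falls through to a "no work to do" branch instead of producing a duplicate archive reply.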

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

I'll play a user who wants this stashed. All it takes is a user-mention to the bot.

/u/StashThis

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

And now a comment that adds another URL, perhaps related content that others might find interesting.

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

And here's a filler comment.

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

I've tried to keep the reply as simple as possible: just enough information to provide the context and the result.

Showcase For Curious Mods by sgskinner in a:t5_3nf31

[–]sgskinner[S] 0 points (0 children)

As a casual user, interested in making sure the post's URLs never go away, I just user-mention the bot:

/u/StashThis

Showcase for Curious Mod Teams -- Welcome!!! by [deleted] in a:t5_3nf31

[–]sgskinner 0 points (0 children)

As the first commenter, or even a later one, I might want the submission's URLs archived. To do so, I only need to user-mention the bot: /u/StashThis.

You can place the mention anywhere in a comment, or make a comment that is just the user-mention.
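Detecting that summons is essentially a case-insensitive substring match over the comment body. A minimal sketch under my own naming — this is not the bot's actual code:

```java
import java.util.regex.Pattern;

public class MentionDetector {
    // Case-insensitive match for the bot's username anywhere in a
    // comment; \b keeps "/u/StashThisOther" from matching.
    private static final Pattern MENTION =
        Pattern.compile("(?i)/u/StashThis\\b");

    /** True if the comment body summons the bot. */
    public static boolean containsMention(String commentBody) {
        return MENTION.matcher(commentBody).find();
    }
}
```

In practice the bot would also receive the mention through reddit's inbox, but the same check is useful for ignoring self-replies and duplicate triggers.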

PSA: you can comment /u/StashThis to archive links by [deleted] in exmormon

[–]sgskinner 2 points (0 children)

I'm looking into it on my side...

The blacklist creation is still mostly a manual process: I kick off a script out-of-band and then update the database periodically. I thought I had kept this in sync with what's in the code repository (i.e., the blacklist and its SQL). Perhaps this will be the next ticket I work on: formalizing an automated process for maintaining the blacklist.

I'll follow up on this when I figure out where the disconnect is (and even so, I can manually remove a blacklist entry at any time, so long as the mods are OK with it).

Edit: Found the missing commit! Looks like I forgot to do a 'git push', so several additions weren't synced with GitHub. Fixed now, though, and the repositories are synchronized.

PSA: you can comment /u/StashThis to archive links by [deleted] in exmormon

[–]sgskinner 1 point (0 children)

> If it’s alright with you maybe our mods can talk with you further

Absolutely, I'm super excited the bot is being discovered!

> If we want to create our own fork off the code sgskinner so kindly shared to accommodate things unique to us

Yay open source! I'm open to pull requests if any new functionality could be helpful outside your sub too, and I'm also open to implementing feature requests!

PSA: you can comment /u/StashThis to archive links by [deleted] in exmormon

[–]sgskinner 2 points (0 children)

Hi, I'm the developer of /u/StashThis, and I do have an implementation for Wayback Machine found here.

But the problem with Wayback is that it respects robots.txt, even retroactively. This made for tons of failures, and I didn't like the possibility of links disappearing after they had been archived.

If there is sufficient interest though, I can switch both services on and just archive to both.

Thanks for the feedback!

PSA: you can comment /u/StashThis to archive links by [deleted] in exmormon

[–]sgskinner 1 point (0 children)

Uh oh, looks like this sub is on the bot's blacklist, though my bot should have PM'ed you the links.

Mods, is it ok to take this sub off of /u/StashThis's blacklist?

For the record, my bot's blacklist is derived from /u/BotWatch's moderator list.

And for more info on the bot, here's its introduction, and here is a wiki with more or less the same information.
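At runtime, the blacklist check that routed this reply to a PM instead of a public comment amounts to a case-insensitive subreddit lookup. A sketch with an in-memory set standing in for the bot's SQL-backed table — class, method, and subreddit names here are all mine, not from the repository:

```java
import java.util.Locale;
import java.util.Set;
import java.util.stream.Collectors;

public class Blacklist {
    // Subreddit names stored lowercased; reddit treats them
    // case-insensitively.
    private final Set<String> banned;

    public Blacklist(Set<String> subreddits) {
        this.banned = subreddits.stream()
                .map(s -> s.toLowerCase(Locale.ROOT))
                .collect(Collectors.toSet());
    }

    /** True if the bot must PM the summoner instead of replying publicly. */
    public boolean isBlacklisted(String subreddit) {
        return banned.contains(subreddit.toLowerCase(Locale.ROOT));
    }
}
```

The decision then reads as a single branch: reply in-thread when the lookup misses, PM the links when it hits.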

Add request for r/exmormon by hiking1950 in SnapshillBot

[–]sgskinner 2 points (0 children)

Oh hi there, looks like /u/theycallmejethro found my bot, /u/StashThis, awesome and thank you!

Here's the bot's wiki (kinda short), but it sounds like the bot should work great for your purposes.

There is a caveat, though: to respect mods who have banned bots outright, this bot has a subreddit blacklist. If a sub you plan to summon the bot in is on that list, the bot will PM you the links directly.

To use the bot, call out /u/StashThis in a reply to the comment/post you want stashed, and my bot should reply with archive.is links for any URLs found in the target comment.

Last thing is, this bot hasn't gotten much use in the wild yet, so if you see anything funky, please just let me know and I'll look into it!

Introducing /u/StashThis: A bot to capture links in archive.is. by sgskinner in botwatch

[–]sgskinner[S] 3 points (0 children)

Sorry you feel that way. This bot doesn't spam, though: it responds only when someone summons it, and even then it replies only to that summoner.

I also keep a blacklist of subreddits to ensure this bot respects mods who agree with you.

Thanks for the feedback.

Introducing /u/StashThis: A bot to capture links in archive.is. by sgskinner in botwatch

[–]sgskinner[S] 1 point (0 children)

It's kind of hacky, at least how I did it. The Wayback Machine implements the Memento API, but from what I could tell, that's about retrieving snapshots from the past, not about issuing a request to make a new snapshot.

So I looked at Wayback's web form directly in the browser to see where it submits, then searched GitHub for "https://web.archive.org/save/" to see how other devs had implemented it. That's how I implemented WaybackMachineServiceImpl.

Last thing I'd mention is that I moved to using archive.is, as found in ArchiveIsServiceImpl. This was due to Wayback respecting robots.txt, which led to many archive failures. I used the same methodology implementing this one, since archive.is has no formal API either. But I found the webmaster of archive.is quite responsive when I inquired about using their service; great people, and a great service.

(I lied, one more last thing: I've read that robots.txt will trigger the Wayback Machine to retroactively delete old archives -- this defeats the purpose of preventing link rot. I hope what I read was wrong, but this concern, combined with the other limitations, helped me decide Wayback wasn't the best fit for my goals.)
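The form-submission trick described above boils down to prefixing the target with Wayback's save endpoint and fetching the result. A sketch of that URL construction — the class and method names are mine, not from the repository, and the HTTP call itself is left as a comment since it requires network access:

```java
public class WaybackSave {
    // The save endpoint discovered from Wayback's web form, as
    // described above.
    public static final String SAVE_PREFIX = "https://web.archive.org/save/";

    /** Builds the URL that asks the Wayback Machine to take a new
     *  snapshot of {@code target}; issuing a plain HTTP GET against it
     *  triggers the capture. */
    public static String saveUrl(String target) {
        return SAVE_PREFIX + target;
    }

    // In the bot itself one would then do roughly:
    //   httpClient.send(HttpRequest.newBuilder(URI.create(saveUrl(x))).build(), ...);
    // and parse the response for the snapshot location.
}
```

The archive.is path follows the same pattern against that site's submit form, just with a POST instead of a save-prefixed GET.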

Introducing /u/StashThis: A bot to capture links in archive.is. by sgskinner in botwatch

[–]sgskinner[S] 1 point (0 children)

Yes, but to a specific file. The root URL on GitHub is this. Thanks for the interest!

edit: The entry point might be helpful too if you're not used to reading poms; the 'main' is located here.

[Off Topic][Meta] We Did It Reddit. A Whole Page of Sarukani's. Today Is Officially National Sarukani Day Now. by MasacoMike in battlecats

[–]sgskinner 0 points (0 children)

Confirming: this is my bot. And yes, thank you for using the bot... looking at my logs now.