Plowshare
Introduction
Plowshare is a set of command-line tools (written entirely in Bash shell script) designed for managing file-sharing websites (aka Hosters).
Plowshare is divided into 6 scripts:
- plowdown, for downloading URLs
- plowup, for uploading files
- plowdel, for deleting remote files
- plowlist, for listing remote shared folders
- plowprobe, for retrieving information about download URLs
- plowmod, easy management (installation or update) of Plowshare modules
Plowshare itself doesn’t support any website; support for each hoster is provided by a module. Plowshare is just the core engine.
Modules are available separately and must be installed in the user directory (see below).
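For example, modules can be fetched and later refreshed with plowmod. The target directory shown here is an assumption based on a default setup (typically ~/.config/plowshare/modules.d/):
$ plowmod --install
$ plowmod --update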
Features
- Small footprint (a few shell scripts). No Java, no Python. Runs fast on embedded devices.
- Few dependencies and portable. Bash and cURL are enough for most hosters.
- Modules (hoster plugins) are simple to write using the Plowshare API (see the sketch after this list).
- Support for automatic online captcha solver services.
- Cache mechanism: hoster session or cookie reuse (to avoid relogin).
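As a rough, non-authoritative sketch of what a module looks like (hypothetical hoster, URL pattern and HTML marker; the variable names and the parse_attr helper follow conventions seen in existing modules, so check the real API documentation before writing one):
# modules/myhoster.sh -- illustrative sketch only
MODULE_MYHOSTER_REGEXP_URL='https\?://\(www\.\)\?myhoster\.com/'
MODULE_MYHOSTER_DOWNLOAD_OPTIONS=""
MODULE_MYHOSTER_DOWNLOAD_RESUME=yes

# $1: cookie file, $2: hoster URL; prints the final file URL on stdout
myhoster_download() {
    local -r COOKIE_FILE=$1
    local -r URL=$2
    local PAGE FILE_URL

    # curl here is Plowshare's wrapper; parse_attr is one of its core helpers
    PAGE=$(curl -c "$COOKIE_FILE" "$URL") || return
    FILE_URL=$(parse_attr 'id="download"' 'href' <<< "$PAGE") || return

    echo "$FILE_URL"
}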
Install
See the INSTALL file for details.
Usage examples
All scripts share the same verbose options:
- -v0 (be quiet, alias: -q)
- -v1 (errors only)
- -v2 (info messages; default)
- -v3 (show all messages)
- -v4 (show all messages, HTML pages and cookies; use this for bug reports)
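For instance, a -v4 trace for a bug report could be captured like this (fake link; assuming, as usual for these scripts, that log messages go to stderr):
$ plowdown -v4 http://www.rapidshare.com/files/86545320/Tux-Trainer.rar 2> plowdown-debug.log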
Getting help:
- --help
- --longhelp (additionally prints modules command-line options)
Exhaustive documentation is available in the man pages.
All examples below are using fake links.
Plowdown
Download a file from Rapidshare:
$ plowdown http://www.rapidshare.com/files/86545320/Tux-Trainer.rar
Download a file from Rapidgator using an account (free or premium):
$ plowdown -a 'myuser:mypassword' http://rapidgator.net/file/49b1b874
Note: The colon (:) is the separator character between login and password.
Enclosing the string in single quotes protects it from shell expansion.
Download a list of links (one link per line):
$ cat file_with_links.txt
# This is a comment
http://depositfiles.com/files/abcdefghi
http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar
$ plowdown file_with_links.txt
Download a list of links (one link per line), commenting out (with #) those successfully downloaded:
$ plowdown -m file_with_links.txt
Note: Files are downloaded consecutively, in the order read from the input text file.
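As an illustration (hypothetical file contents; the exact marker placement may differ), the list could look like this after a run where only the first link succeeded:
$ cat file_with_links.txt
#http://depositfiles.com/files/abcdefghi
http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar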
Download a file from Oron with Death by Captcha service:
$ plowdown --deathbycaptcha='user:pass' http://oron.com/dw726z0ohky5
Download a file from Rapidshare with a proxy (cURL supports the http_proxy and https_proxy environment variables; the default port is 3128):
$ export http_proxy=http://xxx.xxx.xxx.xxx:80
$ plowdown http://www.rapidshare.com/files/86545320/Tux-Trainer.rar
Download a file while limiting the download speed (in bytes per second):
$ plowdown --max-rate 900K http://www.rapidshare.com/files/86545320/Tux-Trainer.rar
Note: Accepted prefixes are: k, K, Ki, M, m, Mi.
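For instance, assuming the usual binary meaning of the Mi suffix, this limits the rate to roughly two mebibytes per second:
$ plowdown --max-rate 2Mi http://www.rapidshare.com/files/86545320/Tux-Trainer.rar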
Download a file from Rapidshare (like Firefox: append a .part suffix to the filename while the file is being downloaded):
$ plowdown --temp-rename http://www.rapidshare.com/files/86545320/Tux-Trainer.rar
Download a password-protected file from Mediafire:
$ plowdown -p 'somepassword' http://www.mediafire.com/?mt0egmhietj60iy
Note: If you don’t specify a password and the link requires one, you’ll be prompted (on stdin) for it.
Avoid never-ending downloads: limit the number of tries (for captchas) and the wait delays for each link:
$ plowdown --max-retries=4 --timeout=3600 my_big_list_file.txt
Retrieve the final URL (don’t use plowdown for the download itself):
$ plowdown -q --skip-final --printf %d http://oron.com/dw726z0ohky5 | xargs wget
Note: This will not work if the final URL (remote host) requires a cookie. For anonymous users,
the generated link is only valid for a limited time, and you can usually download the file only once.
Plowup
Upload a single file anonymously to BayFiles:
$ plowup bayfiles /tmp/foo.bar
Upload a bunch of files anonymously to 2Shared (doesn’t recurse subdirectories):
$ plowup 2shared /path/myphotos/*
Note: * is a wildcard character expanded by the Bash interpreter.
Upload a file to Rapidshare with an account (premium or free):
$ plowup -a 'myuser:mypassword' rapidshare /path/xxx
Upload a file to Mirrorcreator changing remote filename:
$ plowup mirrorcreator /path/myfile.txt:anothername.txt
Note: The colon (:) is the separator character between the local filename and the remote filename.
Upload a file to MegaShares (anonymously) and set description:
$ plowup -d "Important document" megashares /path/myfile.tex
Upload a file to Oron anonymously with a proxy:
$ export http_proxy=http://xxx.xxx.xxx.xxx:80
$ export https_proxy=http://xxx.xxx.xxx.xxx:80
$ plowup oron /path/myfile.txt
Abort a slow upload (if the rate stays below the limit for 30 seconds):
$ plowup --min-rate 100k mediafire /path/bigfile.zip
Modify remote filenames (example: foobar.rar gives foobar-PLOW.rar):
$ plowup --name='%g-PLOW.%x' mirrorcreator *.rar
Remark: cURL cannot upload files containing a comma (,) in their filename, but plowup will
temporarily create a symlink for you.
Use cache over sessions to avoid multiple logins:
$ plowup --cache=shared -a 'user:password' 1fichier file1.zip
$ plowup --cache=shared 1fichier file2.zip
The first command performs the login stage and saves the session (token or cookie) in
~/.config/plowshare/storage/module-name.txt.
The second command reuses the stored data and bypasses the login step; you don’t have to specify credentials again.
Note: Only a few hosters currently support the cache mechanism. Have a look at the
Plowshare legacy modules matrix for more information.
Custom results: print the upload time, link and filename in HTML format:
$ plowup 1fichier -v0 --printf '<li>%T: <a href="%u">%f</a>%n' 5MiB.bin 10MB.bin
<li>11:12:42: <a href="https://1fichier.com/?52jwehc851">5MiB.bin</a>
<li>11:12:46: <a href="https://1fichier.com/?bn1jdvtpqi">10MB.bin</a>
Plowdel
Delete a file from MegaShares (delete link required):
$ plowdel http://d01.megashares.com/?dl=6EUeDtS
Delete files (deletes are successive, not parallel):
$ plowdel http://d01.megashares.com/?dl=6EUeDtS http://depositfiles.com/rmv/1643181821669253
Delete a file from Rapidshare (account is required):
$ plowdel -a myuser:mypassword http://rapidshare.com/files/293672730/foo.rar
Plowlist
List links contained in a shared folder link and download them all:
$ plowlist http://www.mediafire.com/?qouncpzfe74s9 > links.txt
$ plowdown -m links.txt
List two shared folders (the first link is processed, then the second one; this is not parallel):
$ plowlist http://www.mediafire.com/?qouncpzfe74s9 http://www.sendspace.com/folder/5njdw7
Remark: Some hosters handle nested (tree) folders; you must specify the -R/--recursive command-line switch to plowlist to enable recursive listing.
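For example, reusing the fake MediaFire folder from above:
$ plowlist -R http://www.mediafire.com/?qouncpzfe74s9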
List a Sendspace web folder. Render results in vBulletin BB syntax:
$ plowlist --printf '[url=%u]%f[/url]%n' http://www.sendspace.com/folder/5njdw7
List links contained in a dummy web page. Render results as HTML list:
$ plowlist --fallback --printf '<li><a href="%u">%u</a></li>%n' http://en.wikipedia.org/wiki/SI_prefix
Plowprobe
Gather public information (filename, file size, file hash, …) about a link.
No captcha solving is requested.
Filter alive links in a text file:
$ plowprobe file_with_links.txt > file_with_active_links.txt
Custom results in shell format: print link information (filename and size):
$ plowprobe --printf '#%f (%s)%n%u%n' http://myhoster.com/files/5njdw7
#foo-bar.rar (134217728)
http://myhoster.com/files/5njdw7
Custom results in JSON format: print link information (filename and size):
$ plowprobe --printf '{"url":"%U","size":%s}%n' http://myhoster.com/files/5njdw7
{"url":"http://myhoster.com/files/5njdw7","size":134217728}
Custom results: print the primary URL (if supported by the hoster and implemented by the module):
$ plowprobe --printf='%v%n' http://a5ts8yt25l.1fichier.com/
https://1fichier.com/?a5ts8yt25l
Use the - argument to read from stdin:
$ plowlist...
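A complete pipeline might look like this (hypothetical combination; plowprobe reads the links produced by plowlist from stdin and keeps only the alive ones):
$ plowlist http://www.mediafire.com/?qouncpzfe74s9 | plowprobe - > filtered_list.txt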