git clone https://code.google.com/p/plowshare/
make
===== plowshare examples =====
==== plowdown examples ====
* Download a file from RapidShare:
plowdown http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar
* Download a file from HotFile using an account (free or premium):
plowdown -a myuser:mypassword http://hotfile.com/dl/68261330/2f2926f/
Note: Don't forget to single-quote your credentials if they contain characters that bash can interpret. For example: 'matt:foo$bar' or 'matt:foo+ -bar'.
* Download a file from Oron with Antigate.com service (feature added since 2012.02.01):
plowdown --antigate=key http://oron.com/dw726z0ohky5
* Download a file from Oron with Death by Captcha service (feature added since 2012.05.04):
plowdown --deathbycaptcha='user:pass' http://oron.com/dw726z0ohky5
* Download a file from RapidShare through a proxy. curl supports the http_proxy and https_proxy environment variables (note: 3128 is the default port).
export http_proxy=http://xxx.xxx.xxx.xxx:80
plowdown http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar
* Download a list of links (one link per line):
cat file_with_links.txt
# This is a comment
http://depositfiles.com/files/abcdefghi
http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar
plowdown file_with_links.txt
* Download a list of links (one link per line) commenting out (with #) those successfully downloaded:
plowdown -m file_with_links.txt
* Limit the download rate (in bytes per second). Accepted prefixes are k, K, Ki, M, m, Mi:
plowdown --max-rate 900K http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar
* Download a password-protected link from Mediafire:
plowdown -p somepassword http://www.mediafire.com/?mt0egmhietj60iy
* Avoid never-ending downloads: limit the number of tries (for captchas) and wait delays for each link:
plowdown --max-retries=20 --timeout=3600 ...
==== plowup examples ====
* Upload a file to your RapidShare account:
plowup --auth=myuser:mypassword rapidshare /path/myfile.txt
* Upload a file to RapidShare anonymously, changing the uploaded file name:
plowup rapidshare /path/myfile.txt:anothername.txt
* Upload a file to TurboBit with an account (premium or free):
plowup -a myuser:mypassword turbobit /path/xxx
* Upload a bunch of files (anonymously to 2Shared):
plowup 2shared /path/myphotos/*
Notice that only files will be sent, subdirectories will be ignored.
* Upload a file to megashares (anonymously) and set a description:
plowup -d 'Important document' megashares /path/myfile.tex
* Upload a file to Zshare anonymously with a proxy.
export http_proxy=http://xxx.xxx.xxx.xxx:80
export https_proxy=http://xxx.xxx.xxx.xxx:80
plowup zshare /path/myfile.txt
* Abort a slow upload (if the transfer rate stays below the given limit for 30 seconds):
plowup --min-rate 100k mediafire /path/bigfile.zip
* Modify remote filenames (example: foobar.rar becomes foobar-PLOW.rar):
plowup --name='%g-PLOW.%x' mirrorcreator *.rar
==== plowlist examples ====
* List links contained in a shared folder link and download them all:
plowlist http://www.mediafire.com/?qouncpzfe74s9 > links.txt
plowdown -m links.txt
Some hosters handle nested (tree) folders; you must pass the -R/--recursive command-line switch to plowlist to enable recursive listing.
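For example, a minimal sketch reusing the MediaFire folder link that appears later on this page (any shared folder URL works the same way):
plowlist -R http://www.mediafire.com/folder/cirj9u226cn3d/softs > links.txt
plowdown -m links.txt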
* List a sendspace.com web folder. Render results in vBulletin "BB code" syntax:
plowlist --printf '[url=%u]%f[/url]%n' http://www.sendspace.com/folder/5njdw7
* List links contained in a dummy web page. Render results as HTML list:
plowlist --fallback --printf '<li>%u</li>%n' http://en.wikipedia.org/wiki/SI_prefix
==== real examples for mediafire ====
=== prepare config for mediafire ===
* Disable SSL mode
{{:linux:mediafire-ssl.png|}}
* Share the mediafire folder with everyone
{{:linux:mediafire-share.png|}}
* Create a developer application so files can be uploaded through the API
{{:linux:mediafire-developer.png|}}
=== plowshare commands ===
* plowlist
plowlist http://www.mediafire.com/folder/cirj9u226cn3d/softs
Retrieving list (mediafire): http://www.mediafire.com/folder/cirj9u226cn3d/softs
# AOE2.zip
http://www.mediafire.com/?cm8jl9p668vomba
# avast_free_antivirus_setup.exe
http://www.mediafire.com/?4nrlt3bpf8i69yz
# Beyond Compare 3.1.3 Build 10374 + Serial Key.rar
http://www.mediafire.com/?vcifnqbyg0t5dn2
# dropbox-2-4-6.zip
http://www.mediafire.com/?e6133i8i3j5i8w8
# FshareSetup_4.7.0.exe
http://www.mediafire.com/?emtdd7lei7dqbhh
# Linux System Administration.pdf
http://www.mediafire.com/?n1i6zi2r74czy58
# MediaFireDesktop-0.10.50.9468-windows-PRODUCTION.exe
http://www.mediafire.com/?p9qqqkx1ikh66c1
# navicat8_mysql_en.zip
http://www.mediafire.com/?cntytp9ja9y642d
# OneDriveSetup.exe
http://www.mediafire.com/?os9abu5gh9qhjte
# Sparx Enterprise Architect v9.0.0.908.rar
http://www.mediafire.com/?59gu9hygmrx2pjz
* plowdown
* plowup
===== Basic knowledge =====
==== Using basic commands (non-script) from the plowshare source ====
=== debug plowlist ===
Debug the command below:
bash -x /usr/local/bin/plowlist http://www.mediafire.com/folder/cirj9u226cn3d/softs
=> Check the curl calls to the PHP API endpoints
* command 1:
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent 'Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0' --silent -d folder_key=cirj9u226cn3d http://www.mediafire.com/api/folder/get_info.php
folder/get_info
cirj9u226cn3d
softs
2014-05-27 20:55:54
public
10
2
294
Anh Vo
http://www.mediafire.com/images/icons/myfiles/default.png
0
Success
2.14
* command 2:
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent 'Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0' --silent -d folder_key=cirj9u226cn3d -d content_type=files -d chunk=1 http://www.mediafire.com/api/folder/get_content.php
folder/get_content
100
files
1
cm8jl9p668vomba
4d737c99c02ac5f3b81b95ec32c2d81f099c1b91f7c505f9cebad2a01f0a7598
AOE2.zip
382775345
public
2014-05-29 02:35:59
no
application/zip
archive
0
0
275
0
http://www.mediafire.com/download/cm8jl9p668vomba/AOE2.zip
....................
Success
2.14
=== debug plowdown ===
Debug the command below:
bash -x /usr/local/bin/plowdown http://www.mediafire.com/?os9abu5gh9qhjte
=> Check the curl calls to the PHP API endpoints
* command 1:
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent 'Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0' --silent --head 'http://www.mediafire.com/?os9abu5gh9qhjte'
HTTP/1.1 301
Date: Fri, 30 May 2014 04:20:28 GMT
Content-Type: text/html; charset=utf-8
Connection: close
Cache-control: no-cache
Expires: 0
Location: /download/os9abu5gh9qhjte/OneDriveSetup.exe
Pragma: no-cache
Set-Cookie: ukey=ue25ueitucwb8fbolgu8dpn869nur89o; expires=Fri, 29-Apr-2016 04:20:28 GMT; path=/; domain=.mediafire.com; httponly
Server: MediaFire
Access-Control-Allow-Origin: *
* command 2:
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent 'Mozilla/5.0 (X11; Linux x86_64; rv:6.0) Gecko/20100101 Firefox/6.0' --silent -b /tmp/plowdown.23305.18589 -c /tmp/plowdown.23305.18589 http://www.mediafire.com/download/os9abu5gh9qhjte/OneDriveSetup.exe
==== Using basic commands based on the mediafire API ====
Refer to: http://www.mediafire.com/developers/
=== Get Login Token ===
curl -k "https://www.mediafire.com/api/user/get_login_token.php?email=itanhchi@yahoo.com&password=8941362&application_id=41323&signature=8d71ce0791d93d9192800b20b1b5aceb486534c2&version=2"
7pbzcy6sm6hnzbfgdxxtt5nn610a1k8k4j2i9t5f11f6a99rikcpt19zpaq14ta6
=== Get Session Token ===
curl -k "https://www.mediafire.com/api/user/get_session_token.php?email=itanhchi@yahoo.com&password=xxxxxxx&application_id=41323&signature=8d71ce0791d93d9192800b20b1b5aceb486534c2&version=2"
user/get_session_token
b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb
5c83209e7f
Success
2.14
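The signature parameter used in the two calls above is derived from the account credentials. A minimal sketch of how it can be computed, assuming the sha1(email + password + application_id + api_key) scheme from the MediaFire developer documentation; the API key below is a placeholder for the key shown on the developer application page:
EMAIL='itanhchi@yahoo.com'
PASSWORD='xxxxxxx'               # account password
APP_ID='41323'                   # application_id from the developer page
API_KEY='your_api_key_here'      # placeholder: API key of the application (assumption)
# SHA-1 over the concatenated credentials; the hex digest is the signature parameter
SIGNATURE=$(printf '%s%s%s%s' "$EMAIL" "$PASSWORD" "$APP_ID" "$API_KEY" | sha1sum | cut -d' ' -f1)
curl -k "https://www.mediafire.com/api/user/get_session_token.php?email=$EMAIL&password=$PASSWORD&application_id=$APP_ID&signature=$SIGNATURE&version=2"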
=== Get mediafire settings ===
curl "http://www.mediafire.com/api/user/get_settings.php?session_token=b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb"
user/get_settings
21474836580
21474836580
yes
yes
no
yes
no
3192776282
12886999040
no
10
inherit
Success
2.14
=== Get folder information ===
curl "http://www.mediafire.com/api/folder/get_content.php?folder_key=cirj9u226cn3d&session_token=b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb&content_type=folders"
folder/get_content
100
folders
1
obj7d1qatoaop
designer
public
2014-05-28 18:41:34
266
2
15
0
no
Success
2.14
=== Get list of files in a folder ===
curl "http://www.mediafire.com/api/folder/get_content.php?folder_key=cirj9u226cn3d&session_token=b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb&content_type=files"
folder/get_content
100
files
1
4nrlt3bpf8i69yz
ed34cb6a33372502ef61aa949fc58fe643a1d8cf830e2a455ab4bcb49759ceda
avast_free_antivirus_setup.exe
94714880
public
2014-05-27 20:53:58
no
application/x-dosexec
application
0
0
271
2
0
0
http://www.mediafire.com/download/4nrlt3bpf8i69yz/avast_free_antivirus_setup.exe
vcifnqbyg0t5dn2
0b7c1953ef9aa4e396dfef944b392d7ae094ffb781c121b89d0e19ee01c6eb8b
Beyond Compare 3.1.3 Build 10374 + Serial Key.rar
5680787
public
2012-05-20 21:05:56
no
application/x-rar
archive
0
0
289
2
1
0
http://www.mediafire.com/download/vcifnqbyg0t5dn2/Beyond_Compare_3.1.3_Build_10374_+_Serial_Key.rar
.................
Success
2.14
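The same listing can be requested as JSON by adding response_format=json (the parameter the mediafire module uses below) and reduced to plowdown-ready links with standard shell tools. A rough sketch; the session token placeholder stands for the token obtained above:
SESSION_TOKEN='...'   # placeholder: session token from user/get_session_token.php
curl -s "http://www.mediafire.com/api/folder/get_content.php?folder_key=cirj9u226cn3d&session_token=$SESSION_TOKEN&content_type=files&response_format=json" \
    | tr ',' '\n' \
    | sed -n 's|.*"quickkey":"\([^"]*\)".*|http://www.mediafire.com/?\1|p'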
=== Get direct download link based on quickkey ===
curl "http://www.mediafire.com/api/file/get_links.php?link_type=direct_download&session_token=b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb&quick_key=59gu9hygmrx2pjz&response_format=xml"
file/get_links
59gu9hygmrx2pjz
http://download646.mediafire.com/8fhslcdsebjg/59gu9hygmrx2pjz/Sparx+Enterprise+Architect+v9.0.0.908.rar
46
Success
2.14
=== add web upload ===
curl "http://www.mediafire.com/api/upload/add_web_upload.php?session_token=b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb&url=http%3A%2F%2Fvn.easyvn.biz%2Ffiles%2Fsoftware%2F2013%2F08%2Facdsee-pro-6-3-build-221.zip&filename=acdsee-pro-6-3-build-221.zip"
upload/add_web_upload
sxs36umq1d
Success
2.14
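add_web_upload only queues the transfer. Its progress can be checked with the upload/poll_upload endpoint that the mediafire module uses further down this page; a minimal sketch reusing the upload key returned above:
curl "http://www.mediafire.com/api/upload/poll_upload.php?session_token=b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb&key=sxs36umq1d"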
==== make file for install ====
===== plowshare core =====
==== post_login ====
post_login() {
    local -r AUTH=$1
    local -r COOKIE=$2
    local -r POSTDATA=$3
    local -r LOGIN_URL=$4
    shift 4
    local -a CURL_ARGS=("$@")
    local USER PASSWORD DATA RESULT

    if [ -z "$AUTH" ]; then
        log_error "$FUNCNAME: authentication string is empty"
        return $ERR_LOGIN_FAILED
    fi

    if [ -z "$COOKIE" ]; then
        log_error "$FUNCNAME: cookie file expected"
        return $ERR_LOGIN_FAILED
    fi

    # Seem faster than
    # IFS=":" read -r USER PASSWORD <<< "$AUTH"
    USER=$(echo "${AUTH%%:*}" | uri_encode_strict)
    PASSWORD=$(echo "${AUTH#*:}" | uri_encode_strict)

    if [ -z "$PASSWORD" -o "$AUTH" = "${AUTH#*:}" ]; then
        PASSWORD=$(prompt_for_password) || true
    fi

    log_notice "Starting login process: $USER/${PASSWORD//?/*}"

    DATA=$(eval echo "${POSTDATA//&/\\&}")

    RESULT=$(curl --cookie-jar "$COOKIE" --data "$DATA" "${CURL_ARGS[@]}" \
        "$LOGIN_URL") || return

    # "$RESULT" can be empty, this is not necessarily an error
    if [ ! -s "$COOKIE" ]; then
        log_debug "$FUNCNAME: no entry was set (empty cookie file)"
        return $ERR_LOGIN_FAILED
    fi

    log_report '=== COOKIE BEGIN ==='
    logcat_report "$COOKIE"
    log_report '=== COOKIE END ==='

    if ! find_in_array CURL_ARGS[@] '-o' '--output'; then
        echo "$RESULT"
    fi
}
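For reference, this is roughly how a hoster module calls post_login (modeled on the mediafire_login function shown later on this page). Note that $USER and $PASSWORD inside LOGIN_DATA are expanded by post_login itself via eval:
LOGIN_DATA='login_email=$USER&login_pass=$PASSWORD&submit_login=Login+to+MediaFire'
PAGE=$(post_login "$AUTH" "$COOKIE_FILE" "$LOGIN_DATA" \
    'https://www.mediafire.com/dynamic/login.php?popup=1' \
    -b "$COOKIE_FILE" --referer 'http://www.mediafire.com') || return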
==== parse ====
# Get lines that match filter+parse regular expressions and extract string from it.
#
# $1: regexp to filter (take lines matching $1 pattern; "." or "" disable filtering).
# $2: regexp to parse (must contain parentheses to capture text). Example: "url:'\(http.*\)'"
# $3: (optional) how many lines to skip (default is 0: filter and match regexp on same line).
# Note: $3 may only be used if line filtering is active ($1 != ".")
# Example ($3=1): get lines matching filter regexp, then apply parse regexp on the line after.
# Example ($3=-1): get lines matching filter regexp, then apply parse regexp on the line before.
# stdin: text data
# stdout: result
parse_all() {
    local PARSE=$2
    local -i N=${3:-0}
    local -r D=$'\001' # Change sed separator to allow '/' characters in regexp
    local STRING FILTER

    if [ -n "$1" -a "$1" != '.' ]; then
        FILTER="\\${D}$1${D}" # /$1/
    else
        [ $N -eq 0 ] || return $ERR_FATAL
    fi

    [ '^' = "${PARSE:0:1}" ] || PARSE="^.*$PARSE"
    [ '$' = "${PARSE:(-1):1}" ] || PARSE+='.*$'
    PARSE="s${D}$PARSE${D}\1${D}p" # s/$PARSE/\1/p

    if [ $N -eq 0 ]; then
        # STRING=$(sed -ne "/$1/ s/$2/\1/p")
        STRING=$(sed -ne "$FILTER $PARSE")
    elif [ $N -eq 1 ]; then
        # Note: Loop (with label) is required for consecutive matches
        # STRING=$(sed -ne ":a /$1/ {n;h; s/$2/\1/p; g;ba;}")
        STRING=$(sed -ne ":a $FILTER {n;h; $PARSE; g;ba;}")
    elif [ $N -eq -1 ]; then
        # STRING=$(sed -ne "/$1/ {x; s/$2/\1/p; b;}" -e 'h')
        STRING=$(sed -ne "$FILTER {x; $PARSE; b;}" -e 'h')
    else
        local -r FIRST_LINE='^\([^\n]*\).*$'
        local -r LAST_LINE='^.*\n\(.*\)$'
        local N_ABS=$(( N < 0 ? -N : N ))
        local I=$(( N_ABS - 2 )) # Note: N_ABS >= 2 due to "elif" above
        local LINES='.*'
        local INIT='N'
        local FILTER_LINE PARSE_LINE

        [ $N_ABS -gt 10 ] &&
            log_notice "$FUNCNAME: are you sure you want to skip $N lines?"

        while (( I-- )); do
            INIT+=';N'
        done
        while (( N_ABS-- )); do
            LINES+='\n.*'
        done

        if [ $N -gt 0 ]; then
            FILTER_LINE=$FIRST_LINE
            PARSE_LINE=$LAST_LINE
        else
            FILTER_LINE=$LAST_LINE
            PARSE_LINE=$FIRST_LINE
        fi

        STRING=$(sed -ne "1 {$INIT;h;n}" \
            -e "H;g;s/^.*\\n\\($LINES\)$/\\1/;h" \
            -e "s/$FILTER_LINE/\1/" \
            -e "$FILTER {g;s/$PARSE_LINE/\1/;$PARSE }")

        # Explanation: [1], [2] let hold space always contain the current line
        # as well as the previous N lines
        # [3] let pattern space contain only the line we test filter regex
        # on (i.e. first buffered line on skip > 0, last line on skip < 0)
        # [4] if filter regex matches, let pattern space contain the line to
        # be parsed and apply parse command
    fi

    if [ -z "$STRING" ]; then
        log_error "$FUNCNAME failed (sed): \"/$1/ ${PARSE//$D//}\" (skip $N)"
        log_notice_stack
        return $ERR_FATAL
    fi

    echo "$STRING"
}
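A quick usage sketch for parse_all (not part of core.sh; assumes core.sh is sourced). The comments show the expected output:
# Extract the quoted href value of every line containing "href".
# Expected output: two lines, "http://a" and "http://b".
printf '<a href="http://a">x</a>\n<a href="http://b">y</a>\n' \
    | parse_all 'href' 'href="\([^"]*\)"'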
# Like parse_all, but get only first match
parse() {
    local PARSE=$2
    local -i N=${3:-0}
    local -r D=$'\001' # Change sed separator to allow '/' characters in regexp
    local STRING FILTER

    if [ -n "$1" -a "$1" != '.' ]; then
        FILTER="\\${D}$1${D}" # /$1/
    else
        [ $N -eq 0 ] || return $ERR_FATAL
    fi

    [ '^' = "${PARSE:0:1}" ] || PARSE="^.*$PARSE"
    [ '$' = "${PARSE:(-1):1}" ] || PARSE+='.*$'
    PARSE="s${D}$PARSE${D}\1${D}p" # s/$PARSE/\1/p

    if [ $N -eq 0 ]; then
        # Note: This requires GNU sed (which is assumed by Plowshare4)
        #STRING=$(sed -ne "$FILTER {$PARSE;ta;b;:a;q;}")
        STRING=$(sed -ne "$FILTER {$PARSE;T;q;}")
    elif [ $N -eq 1 ]; then
        #STRING=$(sed -ne ":a $FILTER {n;h;$PARSE;tb;ba;:b;q;}")
        STRING=$(sed -ne ":a $FILTER {n;$PARSE;Ta;q;}")
    elif [ $N -eq -1 ]; then
        #STRING=$(sed -ne "$FILTER {g;$PARSE;ta;b;:a;q;}" -e 'h')
        STRING=$(sed -ne "$FILTER {g;$PARSE;T;q;}" -e 'h')
    else
        local -r FIRST_LINE='^\([^\n]*\).*$'
        local -r LAST_LINE='^.*\n\(.*\)$'
        local N_ABS=$(( N < 0 ? -N : N ))
        local I=$(( N_ABS - 2 ))
        local LINES='.*'
        local INIT='N'
        local FILTER_LINE PARSE_LINE

        [ $N_ABS -gt 10 ] &&
            log_notice "$FUNCNAME: are you sure you want to skip $N lines?"

        while (( I-- )); do
            INIT+=';N'
        done
        while (( N_ABS-- )); do
            LINES+='\n.*'
        done

        if [ $N -gt 0 ]; then
            FILTER_LINE=$FIRST_LINE
            PARSE_LINE=$LAST_LINE
        else
            FILTER_LINE=$LAST_LINE
            PARSE_LINE=$FIRST_LINE
        fi

        # Note: Need to "clean" conditional flag after s/$PARSE_LINE/\1/
        STRING=$(sed -ne "1 {$INIT;h;n}" \
            -e "H;g;s/^.*\\n\\($LINES\)$/\\1/;h" \
            -e "s/$FILTER_LINE/\1/" \
            -e "$FILTER {g;s/$PARSE_LINE/\1/;ta;:a;$PARSE;T;q;}")
    fi

    if [ -z "$STRING" ]; then
        log_error "$FUNCNAME failed (sed): \"/$1/ ${PARSE//$D//}\" (skip $N)"
        log_notice_stack
        return $ERR_FATAL
    fi

    echo "$STRING"
}
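The same input with parse returns only the first match:
# Expected output: a single line, "http://a"
printf '<a href="http://a">x</a>\n<a href="http://b">y</a>\n' \
    | parse 'href' 'href="\([^"]*\)"'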
==== parse_json ====
# Simple and limited JSON parsing
#
# Notes:
# - Single line parsing oriented (user should strip newlines first): no tree model
# - Array and Object types: basic poor support (depth 1 without complex types)
# - String type: no support for escaped unicode characters (\uXXXX)
# - No non standard C/C++ comments handling (like in JSONP)
# - If several entries exist on same line: last occurrence is taken, but:
# consider precedence (order of priority): number, boolean/empty, string.
# - If several entries exist on different lines: all are returned (it's a parse_all_json)
#
# $1: variable name (string)
# $2: (optional) preprocess option. Accepted values are:
# - "join": make a single line of input stream.
# - "split": split input buffer on comma character (,).
# stdin: JSON data
# stdout: result
parse_json() {
    local -r NAME="\"$1\"[[:space:]]*:[[:space:]]*"
    local STRING PRE
    local -r END='\([,}[:space:]].*\)\?$'

    if [ "$2" = 'join' ]; then
        PRE="tr -d '\n\r'"
    elif [ "$2" = 'split' ]; then
        PRE=sed\ -e\ 's/,[[:space:]]*\(["{]\)/\n\1/g'
    else
        PRE='cat'
    fi

    # Note: "ta;:a" is a trick for cleaning conditional flag
    STRING=$($PRE | sed \
        -ne "/$NAME\[/{s/^.*$NAME\(\[[^]]*\]\).*$/\1/;ta;:a;s/^\[.*\[//;t;p;q;}" \
        -ne "/$NAME{/{s/^.*$NAME\({[^}]*}\).*$/\1/;ta;:a;s/^{.*{//;t;p;q;}" \
        -ne "s/^.*$NAME\(-\?\(0\|[1-9][[:digit:]]*\)\(\.[[:digit:]]\+\)\?\([eE][-+]\?[[:digit:]]\+\)\?\)$END/\1/p" \
        -ne "s/^.*$NAME\(true\|false\|null\)$END/\1/p" \
        -ne "s/\\\\\"/\\\\q/g;s/^.*$NAME\"\([^\"]*\)\"$END/\1/p")

    if [ -z "$STRING" ]; then
        log_error "$FUNCNAME failed (json): \"$1\""
        log_notice_stack
        return $ERR_FATAL
    fi

    # Translate two-character sequence escape representations
    STRING=${STRING//\\\//\/}
    STRING=${STRING//\\\\/\\}
    STRING=${STRING//\\q/\"}
    STRING=${STRING//\\b/$'\b'}
    STRING=${STRING//\\f/$'\f'}
    STRING=${STRING//\\n/$'\n'}
    STRING=${STRING//\\r/$'\r'}
    STRING=${STRING//\\t/$'\t'}

    echo "$STRING"
}
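A small usage sketch for parse_json (assumes core.sh is sourced). The sample JSON is modeled on the pre_upload fields referenced by the mediafire module later on this page:
JSON='{"response":{"action":"upload\/pre_upload","duplicate_quickkey":"2xrys3f97a9t9ce","storage_limit_exceeded":"no","result":"Success"}}'
parse_json result <<< "$JSON"                    # prints: Success
parse_json duplicate_quickkey <<< "$JSON"        # prints: 2xrys3f97a9t9ce
parse_json storage_limit_exceeded <<< "$JSON"    # prints: no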
===== plowshare for mediafire =====
==== mediafire login ====
=== mediafire web ===
== login form ==
The login form page contains the strings "How do you want to log in?" and "Invalid email or password."
=> POST URL: /dynamic/client_login/mediafire.php
== debug the login POST request ==
* Header
(Request-Line) POST /dynamic/client_login/mediafire.php HTTP/1.1
Host www.mediafire.com
* Post data
login_email itanhchi@yahoo.com
login_pass xxxxxxx
login_remember on
== debug the response code for login ==
Compare a successful response with a failed one:
{{:linux:mediafire-reponselogin.png|}}\\
var et= 15
=> Login OK\\
var fp='itanhchi@yahoo.com';
=> username
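The same check can be reproduced outside plowshare with a single curl call, using the POST URL and form fields captured above (a sketch; credentials are placeholders and the https scheme is assumed):
curl -s -c cookies.txt \
    -d 'login_email=you@example.com' \
    -d 'login_pass=yourpassword' \
    -d 'login_remember=on' \
    'https://www.mediafire.com/dynamic/client_login/mediafire.php' \
    | grep -o "var et= *-\?[0-9]\+\|var fp='[^']*'"
# "var et= 15" means the login succeeded; "var fp='...'" contains the account email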
=== mediafire login code ===
mediafire_download() {
    local -r COOKIE_FILE=$1
    local -r BASE_URL='http://www.mediafire.com'
    local FILE_ID URL PAGE JSON JS_VAR

    if [ -n "$AUTH_FREE" ]; then
        mediafire_login "$AUTH_FREE" "$COOKIE_FILE" "$BASE_URL" || return
    fi
    .................
}
# Static function. Proceed with login
# $1: authentication
# $2: cookie file
# $3: base URL
mediafire_login() {
    local -r AUTH_FREE=$1
    local -r COOKIE_FILE=$2
    local -r BASE_URL=$3
    local -r ENC_BASE_URL=$(uri_encode_strict <<< "$BASE_URL/")
    local LOGIN_DATA PAGE CODE NAME

    # Make sure we have "ukey" cookie (mandatory)
    curl -c "$COOKIE_FILE" -o /dev/null "$BASE_URL"

    # Notes: - "login_remember=on" not required
    #        - force SSLv3 to avoid problems with curl using OpenSSL/1.0.1
    LOGIN_DATA='login_email=$USER&login_pass=$PASSWORD&submit_login=Login+to+MediaFire'
    PAGE=$(post_login "$AUTH_FREE" "$COOKIE_FILE" "$LOGIN_DATA" \
        "${BASE_URL/#http/https}/dynamic/login.php?popup=1" \
        -b "$COOKIE_FILE" --sslv3 --referer "$BASE_URL") || return

    # Note: Cookies "user" and "session" get set on successful login, "skey" is changed
    CODE=$(echo "$PAGE" | parse 'var et' 'var et= \(-\?[[:digit:]]\+\);') || return
    NAME=$(echo "$PAGE" | parse 'var fp' "var fp='\([^']\+\)';") || return

    # Check for errors
    # Note: All error codes are explained in page returned by server.
    if [ $CODE -ne 15 ]; then
        log_debug "Remote error: $CODE"
        return $ERR_LOGIN_FAILED
    fi

    log_debug "Successfully logged in as member '$NAME'"
}
==== mediafire download ====
* download.sh
local FUNCTION=${MODULE}_download
$FUNCTION "$COOKIE_FILE" "$URL_ENCODED" >"$DRESULT" || DRETVAL=$?
* mediafire.sh
# Output a mediafire file download URL
# $1: cookie file
# $2: mediafire.com url
# stdout: real file download link
mediafire_download() {
    local -r COOKIE_FILE=$1
    local -r BASE_URL='http://www.mediafire.com'
    local FILE_ID URL PAGE JSON JS_VAR

    if [ -n "$AUTH_FREE" ]; then
        mediafire_login "$AUTH_FREE" "$COOKIE_FILE" "$BASE_URL" || return
    fi

    FILE_ID=$(mediafire_extract_id "$2") || return

    if ! mediafire_is_file_id "$FILE_ID"; then
        log_error 'This is a folder link. Please use plowlist!'
        return $ERR_FATAL
    fi

    # Only get site headers first to capture direct download links
    URL=$(curl --head "$BASE_URL/?$FILE_ID" | grep_http_header_location_quiet) || return

    case "$URL" in
        # no redirect, normal download
        '')
            URL="$BASE_URL/?$FILE_ID"
            ;;
        /download/*)
            URL="$BASE_URL$URL"
            ;;
        http://*)
            log_debug 'Direct download'
            echo "$URL"
            return 0
            ;;
        *errno=999)
            return $ERR_LINK_NEED_PERMISSIONS
            ;;
        *errno=320|*errno=378)
            return $ERR_LINK_DEAD
            ;;
        *errno=*)
            log_error "Unexpected remote error: ${URL#*errno=}"
            return $ERR_FATAL
    esac

    PAGE=$(curl -b "$COOKIE_FILE" -c "$COOKIE_FILE" "$URL" | break_html_lines) || return

    # Invalid or Deleted File.
    match 'Invalid or Deleted File' "$PAGE" && return $ERR_LINK_DEAD

    # handle captcha (reCaptcha or SolveMedia) if there is one
    if match '
==== mediafire upload ====
Refer to: https://www.mediafire.com/developers/upload.php
=== upload code ===
* upload.sh
FUNCTION=${MODULE}_upload
$FUNCTION "$UCOOKIE" "$LOCALFILE" \
    "$DESTFILE" >"$URESULT" || URETVAL=$?
* mediafire.sh
# Upload a file to mediafire using official API.
# https://www.mediafire.com/developers/upload.php
# $1: cookie file (unused here)
# $2: input file (with full path)
# $3: remote filename
# stdout: mediafire.com download link
mediafire_upload() {
    local -r COOKIE_FILE=$1
    local -r FILE=$2
    local -r DEST_FILE=$3
    local -r BASE_URL='https://www.mediafire.com'
    local SESSION_TOKEN JSON RES KEY_ID UPLOAD_KEY QUICK_KEY FOLDER_KEY

    # Sanity checks
    [ -n "$AUTH_FREE" ] || return $ERR_LINK_NEED_PERMISSIONS

    if [ -n "$ASYNC" ] && ! match_remote_url "$FILE"; then
        log_error 'Cannot upload local files asynchronously.'
        return $ERR_BAD_COMMAND_LINE
    fi

    if [ -n "$ASYNC" -a \( -n "$DESCRIPTION" -o -n "$LINK_PASSWORD" -o \
        -n "$PRIVATE_FILE" \) ] ; then
        log_error 'Advanced options not available for asynchronously uploaded files.'
        return $ERR_BAD_COMMAND_LINE
    fi

    # FIXME
    if [ -z "$ASYNC" ] && match_remote_url "$FILE"; then
        log_error 'Synchronous remote upload not implemented.'
        return $ERR_BAD_COMMAND_LINE
    fi

    SESSION_TOKEN=$(mediafire_api_get_session_token "$AUTH_FREE" "$BASE_URL") || return
    log_debug "Session Token: '$SESSION_TOKEN'"

    # API bug
    if [ "${#DEST_FILE}" -lt 3 ]; then
        log_error 'Filenames less than 3 characters cannot be uploaded. Mediafire API bug? This is not a plowshare bug!'
    fi

    if [ -n "$FOLDER" ]; then
        FOLDER_KEY=$(mediafire_check_folder "$SESSION_TOKEN" "$BASE_URL" "$FOLDER") || return
    fi

    # Check for duplicate name
    JSON=$(curl --get -d "session_token=$SESSION_TOKEN" -d "filename=$DEST_FILE" \
        -d 'response_format=json' \
        -d 'action_on_duplicate=keep' \
        ${FOLDER:+-d "upload_folder_key=$FOLDER_KEY"} \
        "$BASE_URL/api/upload/pre_upload.php") || return

    RES=$(parse_json result <<<"$JSON") || return
    if [ "$RES" != 'Success' ]; then
        local NUM MSG
        NUM=$(parse_json_quiet error <<<"$JSON")
        MSG=$(parse_json_quiet message <<<"$JSON")
        log_error "Unexpected remote error (pre_upload): $NUM, '$MSG'"
        return $ERR_FATAL
    fi

    # "duplicate_name":"yes","duplicate_quickkey":"2xrys3f97a9t9ce"
    # Note: "duplicate_name" is not always returned ???
    QUICK_KEY=$(parse_json_quiet 'duplicate_quickkey' <<<"$JSON") || return
    if [ -n "$QUICK_KEY" ]; then
        if [ -n "$UNIQUE_FILE" ]; then
            log_error 'Duplicated filename. Return original quickkey.'
            echo "$BASE_URL/?$QUICK_KEY"
            return 0
        else
            log_debug 'a file with the same filename already exists. File will be renamed.'
        fi
    fi

    # "used_storage_size":"10438024","storage_limit":"53687091200","storage_limit_exceeded":"no"
    RES=$(parse_json storage_limit_exceeded <<<"$JSON") || return
    if [ "$RES" = 'yes' ]; then
        log_error 'Storage limit exceeded. Abort.'
        return $ERR_SIZE_LIMIT_EXCEEDED
    fi

    # Start upload
    if match_remote_url "$FILE"; then
        JSON=$(curl -d "session_token=$SESSION_TOKEN" \
            -d "filename=$DESTFILE" \
            -d 'response_format=json' \
            --data-urlencode "url=$FILE" \
            ${FOLDER:+"-d folder_key=$FOLDER_KEY"} \
            "$BASE_URL/api/upload/add_web_upload.php") || return
        KEY_ID='upload_key'
    else
        local FILE_SIZE
        FILE_SIZE=$(get_filesize "$FILE") || return

        JSON=$(curl_with_log -F "Filedata=@$FILE;filename=$DESTFILE" \
            --header "x-filename: $DEST_FILE" \
            --header "x-size: $FILE_SIZE" \
            "$BASE_URL/api/upload/upload.php?session_token=$SESSION_TOKEN&action_on_duplicate=keep&response_format=json${FOLDER:+"&uploadkey=$FOLDER_KEY"}") || return
        KEY_ID='key'
    fi

    # Check for errors
    RES=$(parse_json result <<<"$JSON") || return
    if [ "$RES" != 'Success' ]; then
        local NUM MSG
        NUM=$(parse_json_quiet error <<<"$JSON")
        MSG=$(parse_json_quiet message <<<"$JSON")
        log_error "Unexpected remote error (upload): $NUM, '$MSG'"
        return $ERR_FATAL
    fi

    UPLOAD_KEY=$(parse_json "$KEY_ID" <<< "$JSON") || return
    log_debug "polling for status update (with key $UPLOAD_KEY)"

    QUICK_KEY=''

    # Wait for upload to finish
    if match_remote_url "$FILE"; then
        [ -n "$ASYNC" ] && return $ERR_ASYNC_REQUEST
    else
        for N in 3 3 2 2 2; do
            wait $N seconds || return

            JSON=$(curl --get -d "session_token=$SESSION_TOKEN" \
                -d 'response_format=json' -d "key=$UPLOAD_KEY" \
                "$BASE_URL/api/upload/poll_upload.php") || return

            RES=$(parse_json result <<<"$JSON") || return
            if [ "$RES" != 'Success' ]; then
                log_error "FIXME '$JSON'"
                return $ERR_FATAL
            fi

            # No more requests for this key
            RES=$(parse_json status <<<"$JSON") || return
            if [ "$RES" = '99' ]; then
                QUICK_KEY=$(parse_json quickkey <<<"$JSON") || return
                break
            fi
        done
    fi

    if [ -z "$QUICK_KEY" ]; then
        local MSG ERR
        MSG=$(parse_json_quiet description <<<"$JSON")
        ERR=$(parse_json_quiet fileerror <<<"$JSON")

        log_error "Bad status $RES: '$MSG'"
        log_debug "fileerror: '$ERR'"
        return $ERR_FATAL
    fi

    if [ -n "$DESCRIPTION" -o -n "$PRIVATE_FILE" ]; then
        JSON=$(curl -d "session_token=$SESSION_TOKEN" \
            -d "quick_key=$QUICK_KEY" -d 'response_format=json' \
            ${DESCRIPTION:+-d "description=$DESCRIPTION"} \
            ${PRIVATE_FILE:+-d 'privacy=private'} \
            "$BASE_URL/api/file/update.php") || return

        RES=$(parse_json result <<<"$JSON")
        if [ "$RES" != 'Success' ]; then
            log_error 'Could not set description/hide file.'
        fi
    fi

    # Note: Making a file private removes its password...
    if [ -n "$LINK_PASSWORD" ]; then
        JSON=$(curl -d "session_token=$SESSION_TOKEN" \
            -d "quick_key=$QUICK_KEY" -d 'response_format=json' \
            -d "password=$LINK_PASSWORD" \
            "$BASE_URL/api/file/update_password.php") || return

        RES=$(parse_json result <<<"$JSON")
        if [ "$RES" != 'Success' ]; then
            log_error 'Could not set password.'
        fi
    fi

    echo "$BASE_URL/?$QUICK_KEY"
}
=== Code analysis ===
* Step 1: Get a session token using $BASE_URL/api/user/get_session_token.php
SESSION_TOKEN=$(mediafire_api_get_session_token "$AUTH_FREE" "$BASE_URL") || return
* Step 2: Check for a duplicate filename using $BASE_URL/api/upload/pre_upload.php
JSON=$(curl --get -d "session_token=$SESSION_TOKEN" -d "filename=$DEST_FILE" \
    -d 'response_format=json' \
    -d 'action_on_duplicate=keep' \
    ${FOLDER:+-d "upload_folder_key=$FOLDER_KEY"} \
    "$BASE_URL/api/upload/pre_upload.php") || return
* Step 3: Start the upload
* either as a remote (web) upload via $BASE_URL/api/upload/add_web_upload.php
if match_remote_url "$FILE"; then
    JSON=$(curl -d "session_token=$SESSION_TOKEN" \
        -d "filename=$DESTFILE" \
        -d 'response_format=json' \
        --data-urlencode "url=$FILE" \
        ${FOLDER:+"-d folder_key=$FOLDER_KEY"} \
        "$BASE_URL/api/upload/add_web_upload.php") || return
* or as a local file upload via $BASE_URL/api/upload/upload.php
else
    local FILE_SIZE
    FILE_SIZE=$(get_filesize "$FILE") || return
    JSON=$(curl_with_log -F "Filedata=@$FILE;filename=$DESTFILE" \
        --header "x-filename: $DEST_FILE" \
        --header "x-size: $FILE_SIZE" \
        "$BASE_URL/api/upload/upload.php?session_token=$SESSION_TOKEN&action_on_duplicate=keep&response_format=json${FOLDER:+"&uploadkey=$FOLDER_KEY"}") || return
    KEY_ID='key'
fi
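* Step 4: Poll for completion using $BASE_URL/api/upload/poll_upload.php until status 99 is returned, then build the final link from the quickkey (this is the polling logic from the function above, repeated here to complete the walkthrough):
JSON=$(curl --get -d "session_token=$SESSION_TOKEN" \
    -d 'response_format=json' -d "key=$UPLOAD_KEY" \
    "$BASE_URL/api/upload/poll_upload.php") || return
RES=$(parse_json status <<<"$JSON") || return
if [ "$RES" = '99' ]; then
    QUICK_KEY=$(parse_json quickkey <<<"$JSON") || return
fi
echo "$BASE_URL/?$QUICK_KEY"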